id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
abiyo27/BibleTTS-EWE | 2023-08-27T15:29:57.000Z | [
"license:cc-by-sa-4.0",
"region:us"
] | abiyo27 | null | null | null | 0 | 0 | ---
license: cc-by-sa-4.0
---
|
anonuseranonuser/Tutorbot-Spock-Bio-Dataset | 2023-08-26T21:59:30.000Z | [
"region:us"
] | anonuseranonuser | null | null | null | 0 | 0 | Entry not found |
Gamer6677/hi | 2023-08-26T22:11:35.000Z | [
"region:us"
] | Gamer6677 | null | null | null | 0 | 0 | Entry not found |
giganion/pdftext-to-latex | 2023-08-26T23:00:27.000Z | [
"region:us"
] | giganion | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_nnxor_l1_2 | 2023-08-26T23:06:05.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence:
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 3086000000
num_examples: 100000
- name: validation
num_bytes: 308600000
num_examples: 10000
- name: test
num_bytes: 308600000
num_examples: 10000
download_size: 2064111198
dataset_size: 3703200000
---
# Dataset Card for "autotree_nnxor_l1_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AshtonIsNotHere/nlp_pp_code_dataset | 2023-08-26T23:07:50.000Z | [
"region:us"
] | AshtonIsNotHere | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2126529.0
num_examples: 1463
- name: test
num_bytes: 528817.0
num_examples: 258
download_size: 948983
dataset_size: 2655346.0
---
# Dataset Card for "nlp_pp_code_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amigun/topblog-tg | 2023-08-26T23:29:40.000Z | [
"region:us"
] | amigun | null | null | null | 0 | 0 | Entry not found |
Immanoel/jimmymatanza | 2023-08-26T23:37:51.000Z | [
"license:other",
"region:us"
] | Immanoel | null | null | null | 0 | 0 | ---
license: other
---
|
mesolitica/google-translate-malaysian-pdf | 2023-08-27T06:50:04.000Z | [
"region:us"
] | mesolitica | null | null | null | 0 | 0 | Entry not found |
Samee-ur/guanaco-100 | 2023-08-26T23:56:38.000Z | [
"region:us"
] | Samee-ur | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 189498
num_examples: 100
download_size: 114615
dataset_size: 189498
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FASOXO/Saturn_CY | 2023-08-27T12:30:40.000Z | [
"region:us"
] | FASOXO | null | null | null | 0 | 0 | Entry not found |
Samee-ur/guanaco-1000 | 2023-08-27T01:46:37.000Z | [
"region:us"
] | Samee-ur | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1696735
num_examples: 1000
download_size: 0
dataset_size: 1696735
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ShideQwQ/RVC-Models | 2023-08-27T01:52:46.000Z | [
"region:us"
] | ShideQwQ | null | null | null | 0 | 0 | Entry not found |
HachiML/truthful_qa-ja-v0.3_forcheck | 2023-08-27T02:41:58.000Z | [
"region:us"
] | HachiML | null | null | null | 0 | 0 | ---
dataset_info:
config_name: generation
features:
- name: id
dtype: int64
- name: type
dtype: string
- name: category
dtype: string
- name: question
dtype: string
- name: best_answer
dtype: string
- name: correct_answers
sequence: string
- name: incorrect_answers
sequence: string
- name: source
dtype: string
- name: question_en
dtype: string
- name: best_answer_en
dtype: string
- name: correct_answers_en
sequence: string
- name: incorrect_answers_en
sequence: string
- name: meta
struct:
- name: kenlm_score
struct:
- name: best_answer
dtype: float64
- name: correct_answers
sequence: float64
- name: incorrect_answers
sequence: float64
- name: question
dtype: float64
splits:
- name: validation
num_bytes: 851812.0636474908
num_examples: 673
download_size: 442750
dataset_size: 851812.0636474908
configs:
- config_name: generation
data_files:
- split: validation
path: generation/validation-*
---
# Dataset Card for "truthful_qa-ja-v0.3_forcheck"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TinyPixel/s-data | 2023-08-27T02:47:28.000Z | [
"region:us"
] | TinyPixel | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 19983660
num_examples: 69374
download_size: 9875458
dataset_size: 19983660
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "s-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
colt082295/article-json | 2023-08-27T03:09:29.000Z | [
"region:us"
] | colt082295 | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_automl_heloc_gosdt_l512_d3 | 2023-08-27T03:13:13.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 11682400000
num_examples: 100000
- name: validation
num_bytes: 1168240000
num_examples: 10000
download_size: 1504688602
dataset_size: 12850640000
---
# Dataset Card for "autotree_automl_heloc_gosdt_l512_d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
oscorrea/short-seo-descriptions | 2023-08-27T03:20:21.000Z | [
"region:us"
] | oscorrea | null | null | null | 0 | 0 | Entry not found |
HUBioDataLab/AlphafoldStructures | 2023-08-28T04:05:44.000Z | [
"region:us"
] | HUBioDataLab | null | null | null | 0 | 0 | Entry not found |
roszcz/maestro-quantized | 2023-08-27T03:44:55.000Z | [
"region:us"
] | roszcz | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: midi_filename
dtype: string
- name: source
dtype: string
- name: pitch
sequence: int16
length: 128
- name: dstart_bin
sequence: int16
length: 128
- name: duration_bin
sequence: int16
length: 128
- name: velocity_bin
sequence: int16
length: 128
splits:
- name: train
num_bytes: 57659609
num_examples: 43727
- name: validation
num_bytes: 6508816
num_examples: 4929
- name: test
num_bytes: 7526034
num_examples: 5695
download_size: 14221054
dataset_size: 71694459
---
# Dataset Card for "maestro-quantized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_nnxor_l1_54 | 2023-08-27T04:11:12.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence:
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 13735600000
num_examples: 100000
- name: validation
num_bytes: 1373560000
num_examples: 10000
- name: test
num_bytes: 1373560000
num_examples: 10000
download_size: 14863203173
dataset_size: 16482720000
---
# Dataset Card for "autotree_nnxor_l1_54"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alterneko/bad_images | 2023-08-27T13:06:30.000Z | [
"region:us"
] | Alterneko | null | null | null | 0 | 0 | Entry not found |
shareAI/CodeChat | 2023-08-27T05:23:33.000Z | [
"license:openrail",
"region:us"
] | shareAI | null | null | null | 3 | 0 | ---
license: openrail
---
|
Mustain/fujiki_49k_japanese_dataset | 2023-09-02T09:44:41.000Z | [
"region:us"
] | Mustain | null | null | null | 0 | 0 | Entry not found |
kjkoo/kb_card_cover_letter | 2023-08-27T05:55:27.000Z | [
"region:us"
] | kjkoo | null | null | null | 0 | 0 | Entry not found |
GtQuik702/SWTrainer | 2023-08-27T06:25:32.000Z | [
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:en",
"license:wtfpl",
"not-for-all-audiences",
"region:us"
] | GtQuik702 | null | null | null | 0 | 0 | ---
license: wtfpl
task_categories:
- text-generation
language:
- en
tags:
- not-for-all-audiences
size_categories:
- 100K<n<1M
--- |
1080SQStudio/Test | 2023-08-27T06:45:35.000Z | [
"license:openrail",
"region:us"
] | 1080SQStudio | null | null | null | 0 | 0 | ---
license: openrail
---
|
yzhuang/autotree_automl_california_gosdt_l512_d3_sd2 | 2023-08-27T06:54:02.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5948000000
num_examples: 100000
- name: validation
num_bytes: 594800000
num_examples: 10000
download_size: 2214686612
dataset_size: 6542800000
---
# Dataset Card for "autotree_automl_california_gosdt_l512_d3_sd2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_california_gosdt_l512_d3_sd1 | 2023-08-27T07:17:56.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5948000000
num_examples: 100000
- name: validation
num_bytes: 594800000
num_examples: 10000
download_size: 2215272445
dataset_size: 6542800000
---
# Dataset Card for "autotree_automl_california_gosdt_l512_d3_sd1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
imone/D4RL | 2023-08-30T15:07:49.000Z | [
"task_categories:reinforcement-learning",
"license:apache-2.0",
"region:us"
] | imone | null | null | null | 2 | 0 | ---
license: apache-2.0
task_categories:
- reinforcement-learning
---
# D4RL Dataset on HuggingFace
This repository hosts the pre-downloaded [D4RL dataset](https://github.com/Farama-Foundation/D4RL) on HuggingFace. It is designed to speed up data downloads for users, eliminating the need to fetch the dataset from its original sources.
## Installation
To use this dataset, you need to clone it into your local `.d4rl` directory. Here are the steps to do so:
1. Navigate to your `.d4rl` directory:
```bash
cd ~/.d4rl
```
2. Clone the dataset repository from HuggingFace:
```bash
git clone https://huggingface.co/datasets/imone/D4RL datasets
```
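3. (Optional) Check that the local copies are picked up. A minimal sketch, assuming the `gym` and `d4rl` packages are installed (the environment name below is only an example):
```python
import gym
import d4rl  # registers the D4RL environments with gym

# Any D4RL task id works the same way; 'halfcheetah-medium-v2' is illustrative.
env = gym.make('halfcheetah-medium-v2')

# get_dataset() reads from ~/.d4rl/datasets when the files are already present,
# so no extra download should be triggered.
dataset = env.get_dataset()
print(dataset['observations'].shape, dataset['actions'].shape)
```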
After these steps, the D4RL dataset will be available for use with the `d4rl` package. |
managgiaate/gergefrgyuerf | 2023-08-27T07:34:40.000Z | [
"license:openrail",
"region:us"
] | managgiaate | null | null | null | 0 | 0 | ---
license: openrail
---
|
yzhuang/autotree_automl_california_gosdt_l512_d3_sd3 | 2023-08-27T07:39:46.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5948000000
num_examples: 100000
- name: validation
num_bytes: 594800000
num_examples: 10000
download_size: 2222092293
dataset_size: 6542800000
---
# Dataset Card for "autotree_automl_california_gosdt_l512_d3_sd3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MohammadHarrisCallME/_NetProgrammingBasics | 2023-08-27T07:42:15.000Z | [
"license:llama2",
"region:us"
] | MohammadHarrisCallME | null | null | null | 0 | 0 | ---
license: llama2
---
|
igenius123/zgjm | 2023-08-27T08:58:49.000Z | [
"license:apache-2.0",
"region:us"
] | igenius123 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
ksabeh/openbrand | 2023-08-27T09:42:09.000Z | [
"region:us"
] | ksabeh | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: category
dtype: string
- name: title
dtype: string
- name: brand
dtype: string
- name: asin
dtype: string
- name: imageURL
dtype: string
- name: position_index
dtype: int64
- name: num_tokens
dtype: int64
- name: title_length
dtype: int64
- name: title_category
dtype: string
splits:
- name: train
num_bytes: 68007488
num_examples: 181551
- name: test
num_bytes: 18875793
num_examples: 50432
- name: automotive
num_bytes: 4523220
num_examples: 12891
- name: cellphones
num_bytes: 51882096
num_examples: 78478
- name: clothes
num_bytes: 37489496
num_examples: 85052
- name: electronics
num_bytes: 4820108
num_examples: 9568
- name: grocery
num_bytes: 1567047
num_examples: 4475
- name: new_cat
num_bytes: 93547671
num_examples: 174381
- name: pets
num_bytes: 4175961
num_examples: 10851
- name: sports
num_bytes: 3804172
num_examples: 10841
- name: toys
num_bytes: 4161246
num_examples: 12657
- name: val
num_bytes: 7583420
num_examples: 20172
download_size: 110231234
dataset_size: 300437718
---
# Dataset Card for "openbrand"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nc33/cross-encoder-law | 2023-08-27T09:17:39.000Z | [
"region:us"
] | nc33 | null | null | null | 0 | 0 | ---
dataset_info:
- config_name: train
features:
- name: __index_level_0__
dtype: 'null'
splits:
- name: train
download_size: 0
dataset_size: 0
- config_name: train1
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: label
dtype: int64
- name: id_ques
dtype: int64
- name: id_doc
dtype: int64
- name: FaQ
dtype: string
- name: full_answer
dtype: string
splits:
- name: train
num_bytes: 1179560276
num_examples: 400000
download_size: 462766037
dataset_size: 1179560276
- config_name: train2
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: label
dtype: int64
- name: id_ques
dtype: int64
- name: id_doc
dtype: int64
- name: FaQ
dtype: string
- name: full_answer
dtype: string
splits:
- name: train
num_bytes: 1179752828
num_examples: 400000
download_size: 462931159
dataset_size: 1179752828
- config_name: train3
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: label
dtype: int64
- name: id_ques
dtype: int64
- name: id_doc
dtype: int64
- name: FaQ
dtype: string
- name: full_answer
dtype: string
splits:
- name: train
num_bytes: 1159471217
num_examples: 392603
download_size: 454750083
dataset_size: 1159471217
configs:
- config_name: train
data_files:
- split: train
path: train/train-*
- config_name: train1
data_files:
- split: train
path: train1/train-*
- config_name: train2
data_files:
- split: train
path: train2/train-*
- config_name: train3
data_files:
- split: train
path: train3/train-*
---
# Dataset Card for "cross-encoder-law"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Pongsak/Pongsak | 2023-08-27T08:58:21.000Z | [
"region:us"
] | Pongsak | null | null | null | 0 | 0 | Entry not found |
hjf-utc/expert_law_dataset | 2023-08-28T13:47:24.000Z | [
"license:cc-by-nc-nd-4.0",
"region:us"
] | hjf-utc | null | null | null | 0 | 0 | ---
license: cc-by-nc-nd-4.0
language:
- de
---
# Law Topics Q&A Dataset - README
## Description
This repository contains a dataset of real questions and answers related to various law topics. The questions are sourced from real individuals, and the answers are provided by legal experts, who are qualified lawyers. The primary language of the dataset is German.
## Files
The dataset comes in three separate JSON Line files:
- `data_conversations_5k.jsonl`: Contains 5,000 conversations.
- `data_conversations_30k.jsonl`: Contains 30,000 conversations.
- `data_conversations_all_26082023.jsonl`: A complete dataset containing all conversations up to the date of August 26, 2023.
## Sample Row
The dataset is structured with the following columns:
- `question_title` (string): The title of the question
- `pricetag` (string): The cost associated with asking the question
- `expert_rating` (string): Rating given to the expert's answer, if any
- `date_question` (string): The date when the question was asked
- `area_of_law` (string): The field of law to which the question pertains
- `questions_user` (sequence): The list of questions from the user.
- `answers_expert` (sequence): The list of answers from the legal expert.
## Use Case
This dataset can be useful for researchers, students, and developers who are interested in:
- Legal NLP applications
- Training models for question-answering systems within the law domain
- Studying the structure and content of legal inquiries and expert responses
- Language translation services, specifically targeting legal topics
## Legal & Ethical Considerations
Please note that this dataset should not be used as a substitute for professional legal advice. The dataset is intended solely for educational and research purposes.
## License
This dataset is available under the cc-by-nc-nd-4.0 License.
## Sample Code to Load Dataset in Python
Here is a sample Python code snippet to load the dataset using the `jsonlines` library.
```python
import jsonlines
# Load 5k dataset
with jsonlines.open('data_conversations_5k.jsonl') as reader:
    for obj in reader:
        print(obj['question_title'])
        print(obj['questions_user'])
        print(obj['answers_expert'])
```
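The files can also be loaded with the Hugging Face `datasets` library. A minimal sketch, assuming the JSONL file sits in the working directory:
```python
from datasets import load_dataset

# Each JSONL file is parsed as generic JSON Lines and exposed as a single split.
ds = load_dataset("json", data_files="data_conversations_5k.jsonl", split="train")

print(ds.column_names)
print(ds[0]["question_title"])
```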
## Contributing
If you find any inconsistencies in the dataset or if you wish to contribute to this project, feel free to open a pull request or raise an issue.
## Contact Information
For any further questions or suggestions, please open an issue on this repository. |
hanamthang/classification | 2023-09-12T06:26:47.000Z | [
"region:us"
] | hanamthang | null | null | null | 0 | 0 | Entry not found |
GISY/product_try | 2023-08-27T09:32:02.000Z | [
"region:us"
] | GISY | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 54614221.0
num_examples: 47
download_size: 54592569
dataset_size: 54614221.0
---
# Dataset Card for "product_try"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SaiedAlshahrani/Arabic_Wikipedia_20230101_nobots | 2023-08-27T09:40:50.000Z | [
"region:us"
] | SaiedAlshahrani | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2805384689
num_examples: 1087947
download_size: 1107700539
dataset_size: 2805384689
---
# Dataset Card for "Arabic_Wikipedia_20230101_nobots"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
g-ronimo/lfpodcast | 2023-08-27T10:12:54.000Z | [
"region:us"
] | g-ronimo | null | null | null | 0 | 0 | Entry not found |
mangostin2010/KakaoChatData-alpaca | 2023-08-27T10:20:15.000Z | [
"license:other",
"region:us"
] | mangostin2010 | null | null | null | 0 | 0 | ---
license: other
---
|
thanhnew2001/taplamvan | 2023-08-27T10:44:00.000Z | [
"region:us"
] | thanhnew2001 | null | null | null | 0 | 0 | Entry not found |
Kuote/AI | 2023-08-27T10:46:21.000Z | [
"region:us"
] | Kuote | null | null | null | 0 | 0 | Entry not found |
ksabeh/openbrand-zs | 2023-08-28T20:14:31.000Z | [
"region:us"
] | ksabeh | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: category
dtype: string
- name: title
dtype: string
- name: brand
dtype: string
- name: asin
dtype: string
- name: imageURL
dtype: string
- name: position_index
dtype: int64
- name: num_tokens
dtype: int64
- name: title_length
dtype: int64
- name: title_category
dtype: string
splits:
- name: train
num_bytes: 24211621
num_examples: 61075
- name: val
num_bytes: 2685833
num_examples: 6788
- name: test
num_bytes: 9453851
num_examples: 25221
- name: electronics
num_bytes: 2423259
num_examples: 4786
- name: sports
num_bytes: 1904597
num_examples: 5420
- name: toys
num_bytes: 2078207
num_examples: 6329
- name: automotive
num_bytes: 2271017
num_examples: 6446
- name: grocery
num_bytes: 776771
num_examples: 2240
download_size: 13092616
dataset_size: 45805156
---
# Dataset Card for "openbrand-zs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vsrinivas/llamini_docs_splitdata | 2023-08-27T10:59:05.000Z | [
"region:us"
] | vsrinivas | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1846734.3
num_examples: 1260
- name: test
num_bytes: 205192.7
num_examples: 140
download_size: 695218
dataset_size: 2051927.0
---
# Dataset Card for "llamini_docs_splitdata"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alterneko/worst | 2023-08-28T01:03:18.000Z | [
"region:us"
] | Alterneko | null | null | null | 0 | 0 | Entry not found |
ecccho/pixiv-novel-aesthetics | 2023-08-29T13:54:19.000Z | [
"region:us"
] | ecccho | null | null | null | 2 | 0 | # R18 novels chosen from pixiv
Language: Chinese
For every file in the dataset:
- The first line is a Unix timestamp.
- The second line is the novel's stats.
- The third line is the novel content.
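A minimal parsing sketch based on that layout (the file name is only a placeholder):
```python
# Split one novel file into its three parts:
# line 1 = Unix timestamp, line 2 = stats, remainder = content.
with open("example_novel.txt", encoding="utf-8") as f:
    timestamp = int(f.readline().strip())
    stats = f.readline().strip()
    content = f.read()

print(timestamp, stats, content[:100])
```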
## Different versions of the dataset
- **aesthetic_2023_8_27**: novels chosen from bookmarks, collected on Aug 27th, 2023
- **toplist_2023_8_29**: novels from the daily toplists of 2020-2023, collected on Aug 29th, 2023
There are no Chinese novels from before 2020; they may all have been deleted, or there may simply have been no Chinese novels on the platform that early.
|
open-llm-leaderboard/details_bigcode-data__pile-1.3b | 2023-08-27T12:43:44.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of bigcode-data/pile-1.3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bigcode-data/pile-1.3b](https://huggingface.co/bigcode-data/pile-1.3b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigcode-data__pile-1.3b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-27T11:45:25.415684](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode-data__pile-1.3b/blob/main/results_2023-08-27T11%3A45%3A25.415684.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2670925459145178,\n\
\ \"acc_stderr\": 0.0321126082440487,\n \"acc_norm\": 0.26951995132086415,\n\
\ \"acc_norm_stderr\": 0.03212015072555486,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931578,\n \"mc2\": 0.3982550193068694,\n\
\ \"mc2_stderr\": 0.01422499198673612\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.28668941979522183,\n \"acc_stderr\": 0.013214986329274763,\n\
\ \"acc_norm\": 0.31399317406143346,\n \"acc_norm_stderr\": 0.013562691224726286\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4004182433778132,\n\
\ \"acc_stderr\": 0.004889817489739691,\n \"acc_norm\": 0.5163314080860386,\n\
\ \"acc_norm_stderr\": 0.004987119003151493\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066655,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066655\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.036906779861372814,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.036906779861372814\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.02713429162874169,\n\
\ \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.02713429162874169\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808779,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808779\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.044084400227680814,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.044084400227680814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.027678452578212387,\n\
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.027678452578212387\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.038095238095238126,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.038095238095238126\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.267741935483871,\n \"acc_stderr\": 0.025189006660212385,\n \"\
acc_norm\": 0.267741935483871,\n \"acc_norm_stderr\": 0.025189006660212385\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114475,\n \"\
acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114475\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268047,\n\
\ \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268047\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3181818181818182,\n \"acc_stderr\": 0.03318477333845331,\n \"\
acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.03318477333845331\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.034801756684660366,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.034801756684660366\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.27692307692307694,\n \"acc_stderr\": 0.022688042352424994,\n\
\ \"acc_norm\": 0.27692307692307694,\n \"acc_norm_stderr\": 0.022688042352424994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22201834862385322,\n \"acc_stderr\": 0.017818849564796624,\n \"\
acc_norm\": 0.22201834862385322,\n \"acc_norm_stderr\": 0.017818849564796624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.29957805907172996,\n \"acc_stderr\": 0.029818024749753102,\n\
\ \"acc_norm\": 0.29957805907172996,\n \"acc_norm_stderr\": 0.029818024749753102\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.17488789237668162,\n\
\ \"acc_stderr\": 0.02549528462644497,\n \"acc_norm\": 0.17488789237668162,\n\
\ \"acc_norm_stderr\": 0.02549528462644497\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.0349814938546247,\n\
\ \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.0349814938546247\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.02860595370200425,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.02860595370200425\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150195,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150195\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925324,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925324\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n\
\ \"acc_stderr\": 0.025755865922632924,\n \"acc_norm\": 0.28938906752411575,\n\
\ \"acc_norm_stderr\": 0.025755865922632924\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.02465968518596728,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.02465968518596728\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25488917861799215,\n\
\ \"acc_stderr\": 0.011130509812662977,\n \"acc_norm\": 0.25488917861799215,\n\
\ \"acc_norm_stderr\": 0.011130509812662977\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2426470588235294,\n \"acc_stderr\": 0.02604066247420126,\n\
\ \"acc_norm\": 0.2426470588235294,\n \"acc_norm_stderr\": 0.02604066247420126\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2696078431372549,\n \"acc_stderr\": 0.017952449196987866,\n \
\ \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.017952449196987866\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.038313051408846034,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.038313051408846034\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401467,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401467\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2289156626506024,\n\
\ \"acc_stderr\": 0.03270745277352477,\n \"acc_norm\": 0.2289156626506024,\n\
\ \"acc_norm_stderr\": 0.03270745277352477\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23391812865497075,\n \"acc_stderr\": 0.03246721765117825,\n\
\ \"acc_norm\": 0.23391812865497075,\n \"acc_norm_stderr\": 0.03246721765117825\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931578,\n \"mc2\": 0.3982550193068694,\n\
\ \"mc2_stderr\": 0.01422499198673612\n }\n}\n```"
repo_url: https://huggingface.co/bigcode-data/pile-1.3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|arc:challenge|25_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hellaswag|10_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-27T11:45:25.415684.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-27T11:45:25.415684.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-27T11:45:25.415684.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-27T11:45:25.415684.parquet'
- config_name: results
data_files:
- split: 2023_08_27T11_45_25.415684
path:
- results_2023-08-27T11:45:25.415684.parquet
- split: latest
path:
- results_2023-08-27T11:45:25.415684.parquet
---
# Dataset Card for Evaluation run of bigcode-data/pile-1.3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bigcode-data/pile-1.3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bigcode-data/pile-1.3b](https://huggingface.co/bigcode-data/pile-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigcode-data__pile-1.3b",
"harness_truthfulqa_mc_0",
split="train")
```
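The aggregated metrics reported below live in the "results" configuration of the same repository. Here is a minimal, untested sketch of how they could be loaded; it only assumes the configuration and split names listed in this card:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split always
# points to the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_bigcode-data__pile-1.3b",
    "results",
    split="latest",
)

# Inspect the first row, which holds the aggregated metrics of the run.
print(results[0])
```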
## Latest results
These are the [latest results from run 2023-08-27T11:45:25.415684](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode-data__pile-1.3b/blob/main/results_2023-08-27T11%3A45%3A25.415684.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2670925459145178,
"acc_stderr": 0.0321126082440487,
"acc_norm": 0.26951995132086415,
"acc_norm_stderr": 0.03212015072555486,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931578,
"mc2": 0.3982550193068694,
"mc2_stderr": 0.01422499198673612
},
"harness|arc:challenge|25": {
"acc": 0.28668941979522183,
"acc_stderr": 0.013214986329274763,
"acc_norm": 0.31399317406143346,
"acc_norm_stderr": 0.013562691224726286
},
"harness|hellaswag|10": {
"acc": 0.4004182433778132,
"acc_stderr": 0.004889817489739691,
"acc_norm": 0.5163314080860386,
"acc_norm_stderr": 0.004987119003151493
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066655,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066655
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.036906779861372814,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.036906779861372814
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.02713429162874169,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.02713429162874169
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808779,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808779
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680814,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.027678452578212387,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.027678452578212387
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.038095238095238126,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.038095238095238126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.267741935483871,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.267741935483871,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114475,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114475
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.03567969772268047,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.03567969772268047
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.03318477333845331,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.03318477333845331
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.034801756684660366,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.034801756684660366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.27692307692307694,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.27692307692307694,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22201834862385322,
"acc_stderr": 0.017818849564796624,
"acc_norm": 0.22201834862385322,
"acc_norm_stderr": 0.017818849564796624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.03038805130167812,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.03038805130167812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29957805907172996,
"acc_stderr": 0.029818024749753102,
"acc_norm": 0.29957805907172996,
"acc_norm_stderr": 0.029818024749753102
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.17488789237668162,
"acc_stderr": 0.02549528462644497,
"acc_norm": 0.17488789237668162,
"acc_norm_stderr": 0.02549528462644497
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.1984732824427481,
"acc_stderr": 0.0349814938546247,
"acc_norm": 0.1984732824427481,
"acc_norm_stderr": 0.0349814938546247
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02860595370200425,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02860595370200425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150195,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150195
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925324,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925324
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.025755865922632924,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.025755865922632924
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590638,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25488917861799215,
"acc_stderr": 0.011130509812662977,
"acc_norm": 0.25488917861799215,
"acc_norm_stderr": 0.011130509812662977
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2426470588235294,
"acc_stderr": 0.02604066247420126,
"acc_norm": 0.2426470588235294,
"acc_norm_stderr": 0.02604066247420126
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.017952449196987866,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.017952449196987866
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.038313051408846034,
"acc_norm": 0.2,
"acc_norm_stderr": 0.038313051408846034
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401467,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401467
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2289156626506024,
"acc_stderr": 0.03270745277352477,
"acc_norm": 0.2289156626506024,
"acc_norm_stderr": 0.03270745277352477
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23391812865497075,
"acc_stderr": 0.03246721765117825,
"acc_norm": 0.23391812865497075,
"acc_norm_stderr": 0.03246721765117825
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931578,
"mc2": 0.3982550193068694,
"mc2_stderr": 0.01422499198673612
}
}
```
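For readers who want to post-process these numbers, here is a small, hypothetical helper (not part of the evaluation harness) that averages the `acc` values of the `hendrycksTest-*` tasks from a dictionary shaped like the one above:
```python
def mmlu_average(results: dict) -> float:
    """Average the `acc` of all hendrycksTest (MMLU) tasks in a results dict."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)
```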
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
thomwolf/test-1.1 | 2023-08-27T11:46:25.000Z | [
"region:us"
] | thomwolf | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v3 | 2023-09-16T22:22:16.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lvkaokao/llama2-7b-hf-chat-lora-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lvkaokao/llama2-7b-hf-chat-lora-v3](https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T22:22:04.429370](https://huggingface.co/datasets/open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v3/blob/main/results_2023-09-16T22-22-04.429370.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0026216442953020135,\n\
\ \"em_stderr\": 0.0005236685642966032,\n \"f1\": 0.05310088087248333,\n\
\ \"f1_stderr\": 0.0014130017638603535,\n \"acc\": 0.3891916037418029,\n\
\ \"acc_stderr\": 0.007656807657466876\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642966032,\n\
\ \"f1\": 0.05310088087248333,\n \"f1_stderr\": 0.0014130017638603535\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.015163002274450341,\n \
\ \"acc_stderr\": 0.0033660229497263472\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.011947592365207404\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|arc:challenge|25_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T22_22_04.429370
path:
- '**/details_harness|drop|3_2023-09-16T22-22-04.429370.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T22-22-04.429370.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T22_22_04.429370
path:
- '**/details_harness|gsm8k|5_2023-09-16T22-22-04.429370.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T22-22-04.429370.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hellaswag|10_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T22_22_04.429370
path:
- '**/details_harness|winogrande|5_2023-09-16T22-22-04.429370.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T22-22-04.429370.parquet'
- config_name: results
data_files:
- split: 2023_09_16T22_22_04.429370
path:
- results_2023-09-16T22-22-04.429370.parquet
- split: latest
path:
- results_2023-09-16T22-22-04.429370.parquet
---
# Dataset Card for Evaluation run of lvkaokao/llama2-7b-hf-chat-lora-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lvkaokao/llama2-7b-hf-chat-lora-v3](https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v3",
"harness_winogrande_5",
split="train")
```
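Beyond these per-task details, the aggregated metrics live in the "results" configuration declared above. The following is a minimal sketch (assuming the standard `datasets` API and the configuration/split names from this card's YAML):
```python
from datasets import load_dataset

# Aggregated metrics of the most recent run: the "results" configuration, "latest" split.
results = load_dataset("open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v3",
    "results",
    split="latest")
```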
## Latest results
These are the [latest results from run 2023-09-16T22:22:04.429370](https://huggingface.co/datasets/open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v3/blob/main/results_2023-09-16T22-22-04.429370.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642966032,
"f1": 0.05310088087248333,
"f1_stderr": 0.0014130017638603535,
"acc": 0.3891916037418029,
"acc_stderr": 0.007656807657466876
},
"harness|drop|3": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642966032,
"f1": 0.05310088087248333,
"f1_stderr": 0.0014130017638603535
},
"harness|gsm8k|5": {
"acc": 0.015163002274450341,
"acc_stderr": 0.0033660229497263472
},
"harness|winogrande|5": {
"acc": 0.7632202052091555,
"acc_stderr": 0.011947592365207404
}
}
```
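To look at the individual examples behind these aggregate numbers, a sketch along the same lines (assuming the "latest" split names declared in the YAML above) is:
```python
from datasets import load_dataset

# Per-example details of the 5-shot Winogrande eval reported above ("latest" split).
winogrande_details = load_dataset("open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v3",
    "harness_winogrande_5",
    split="latest")
print(winogrande_details[0])  # one evaluated example row
```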
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_migtissera__Synthia-70B | 2023-08-27T12:43:48.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of migtissera/Synthia-70B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Synthia-70B](https://huggingface.co/migtissera/Synthia-70B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-70B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-23T05:19:54.133935](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B/blob/main/results_2023-08-23T05%3A19%3A54.133935.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6884676888614354,\n\
\ \"acc_stderr\": 0.03140279617853231,\n \"acc_norm\": 0.6922701370438118,\n\
\ \"acc_norm_stderr\": 0.03137403065384253,\n \"mc1\": 0.43451652386780903,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.5978847833710849,\n\
\ \"mc2_stderr\": 0.014931476744751782\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.658703071672355,\n \"acc_stderr\": 0.013855831287497723,\n\
\ \"acc_norm\": 0.6945392491467577,\n \"acc_norm_stderr\": 0.013460080478002503\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6826329416450906,\n\
\ \"acc_stderr\": 0.004645003662067883,\n \"acc_norm\": 0.8711412069308903,\n\
\ \"acc_norm_stderr\": 0.003343588514866123\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.0327900040631005,\n\
\ \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.0327900040631005\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03309615177059007,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03309615177059007\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.03692820767264866,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.03692820767264866\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6468085106382979,\n \"acc_stderr\": 0.031245325202761926,\n\
\ \"acc_norm\": 0.6468085106382979,\n \"acc_norm_stderr\": 0.031245325202761926\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266237,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266237\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.02555992055053101,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.02555992055053101\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"\
acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.030117688929503592,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.030117688929503592\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.024825909793343336,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.024825909793343336\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678175,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678175\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.023507579020645365,\n\
\ \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.023507579020645365\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.02904560029061626,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.02904560029061626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7563025210084033,\n \"acc_stderr\": 0.027886828078380575,\n\
\ \"acc_norm\": 0.7563025210084033,\n \"acc_norm_stderr\": 0.027886828078380575\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958788,\n \"\
acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958788\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.033622774366080424,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.033622774366080424\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.869198312236287,\n \"acc_stderr\": 0.02194876605947076,\n \
\ \"acc_norm\": 0.869198312236287,\n \"acc_norm_stderr\": 0.02194876605947076\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8161434977578476,\n\
\ \"acc_stderr\": 0.025998379092356513,\n \"acc_norm\": 0.8161434977578476,\n\
\ \"acc_norm_stderr\": 0.025998379092356513\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476078,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476078\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.5535714285714286,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822583,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822583\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n\
\ \"acc_stderr\": 0.017893784904018533,\n \"acc_norm\": 0.9188034188034188,\n\
\ \"acc_norm_stderr\": 0.017893784904018533\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8722860791826309,\n\
\ \"acc_stderr\": 0.011935626313999878,\n \"acc_norm\": 0.8722860791826309,\n\
\ \"acc_norm_stderr\": 0.011935626313999878\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7716763005780347,\n \"acc_stderr\": 0.022598703804321628,\n\
\ \"acc_norm\": 0.7716763005780347,\n \"acc_norm_stderr\": 0.022598703804321628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n\
\ \"acc_stderr\": 0.016598022120580428,\n \"acc_norm\": 0.43910614525139663,\n\
\ \"acc_norm_stderr\": 0.016598022120580428\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879905,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7588424437299035,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.7588424437299035,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.808641975308642,\n \"acc_stderr\": 0.02188770461339615,\n\
\ \"acc_norm\": 0.808641975308642,\n \"acc_norm_stderr\": 0.02188770461339615\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5425531914893617,\n \"acc_stderr\": 0.029719281272236837,\n \
\ \"acc_norm\": 0.5425531914893617,\n \"acc_norm_stderr\": 0.029719281272236837\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5528031290743155,\n\
\ \"acc_stderr\": 0.012698825252435117,\n \"acc_norm\": 0.5528031290743155,\n\
\ \"acc_norm_stderr\": 0.012698825252435117\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7132352941176471,\n \"acc_stderr\": 0.027472274473233815,\n\
\ \"acc_norm\": 0.7132352941176471,\n \"acc_norm_stderr\": 0.027472274473233815\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7516339869281046,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7795918367346939,\n \"acc_stderr\": 0.026537045312145298,\n\
\ \"acc_norm\": 0.7795918367346939,\n \"acc_norm_stderr\": 0.026537045312145298\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101716,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101716\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43451652386780903,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.5978847833710849,\n\
\ \"mc2_stderr\": 0.014931476744751782\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/Synthia-70B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|arc:challenge|25_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hellaswag|10_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T05:19:54.133935.parquet'
---
# Dataset Card for Evaluation run of migtissera/Synthia-70B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Synthia-70B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Synthia-70B](https://huggingface.co/migtissera/Synthia-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-70B",
"harness_truthfulqa_mc_0",
split="train")
```
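To see every available configuration for this model, or to pin a specific run instead of "latest", a minimal sketch (assuming the standard `datasets` API and the configuration/split names declared in the YAML above) is:
```python
from datasets import get_dataset_config_names, load_dataset

# Enumerate the configurations declared in this card (per-task details plus "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_migtissera__Synthia-70B")
print(len(configs), configs[:5])

# Load one eval at the timestamped split of the run instead of the "latest" alias.
anatomy_details = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-70B",
    "harness_hendrycksTest_anatomy_5",
    split="2023_08_23T05_19_54.133935")
```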
## Latest results
These are the [latest results from run 2023-08-23T05:19:54.133935](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B/blob/main/results_2023-08-23T05%3A19%3A54.133935.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6884676888614354,
"acc_stderr": 0.03140279617853231,
"acc_norm": 0.6922701370438118,
"acc_norm_stderr": 0.03137403065384253,
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.5978847833710849,
"mc2_stderr": 0.014931476744751782
},
"harness|arc:challenge|25": {
"acc": 0.658703071672355,
"acc_stderr": 0.013855831287497723,
"acc_norm": 0.6945392491467577,
"acc_norm_stderr": 0.013460080478002503
},
"harness|hellaswag|10": {
"acc": 0.6826329416450906,
"acc_stderr": 0.004645003662067883,
"acc_norm": 0.8711412069308903,
"acc_norm_stderr": 0.003343588514866123
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.0327900040631005,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.0327900040631005
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059007,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059007
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.03692820767264866,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.03692820767264866
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6468085106382979,
"acc_stderr": 0.031245325202761926,
"acc_norm": 0.6468085106382979,
"acc_norm_stderr": 0.031245325202761926
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266237,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266237
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.02555992055053101,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.02555992055053101
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.030117688929503592,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.030117688929503592
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.024825909793343336,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.024825909793343336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678175,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678175
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.023507579020645365,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.023507579020645365
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.02904560029061626,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.02904560029061626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7563025210084033,
"acc_stderr": 0.027886828078380575,
"acc_norm": 0.7563025210084033,
"acc_norm_stderr": 0.027886828078380575
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8880733944954129,
"acc_stderr": 0.013517352714958788,
"acc_norm": 0.8880733944954129,
"acc_norm_stderr": 0.013517352714958788
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.033622774366080424,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.033622774366080424
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.869198312236287,
"acc_stderr": 0.02194876605947076,
"acc_norm": 0.869198312236287,
"acc_norm_stderr": 0.02194876605947076
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8161434977578476,
"acc_stderr": 0.025998379092356513,
"acc_norm": 0.8161434977578476,
"acc_norm_stderr": 0.025998379092356513
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476078,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476078
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911901,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911901
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822583,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822583
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.017893784904018533,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.017893784904018533
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8722860791826309,
"acc_stderr": 0.011935626313999878,
"acc_norm": 0.8722860791826309,
"acc_norm_stderr": 0.011935626313999878
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7716763005780347,
"acc_stderr": 0.022598703804321628,
"acc_norm": 0.7716763005780347,
"acc_norm_stderr": 0.022598703804321628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43910614525139663,
"acc_stderr": 0.016598022120580428,
"acc_norm": 0.43910614525139663,
"acc_norm_stderr": 0.016598022120580428
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879905,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7588424437299035,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.7588424437299035,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.808641975308642,
"acc_stderr": 0.02188770461339615,
"acc_norm": 0.808641975308642,
"acc_norm_stderr": 0.02188770461339615
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5425531914893617,
"acc_stderr": 0.029719281272236837,
"acc_norm": 0.5425531914893617,
"acc_norm_stderr": 0.029719281272236837
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5528031290743155,
"acc_stderr": 0.012698825252435117,
"acc_norm": 0.5528031290743155,
"acc_norm_stderr": 0.012698825252435117
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7132352941176471,
"acc_stderr": 0.027472274473233815,
"acc_norm": 0.7132352941176471,
"acc_norm_stderr": 0.027472274473233815
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7795918367346939,
"acc_stderr": 0.026537045312145298,
"acc_norm": 0.7795918367346939,
"acc_norm_stderr": 0.026537045312145298
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101716,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101716
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.5978847833710849,
"mc2_stderr": 0.014931476744751782
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_gaodrew__gaodrew-gorgonzola-13b | 2023-09-23T16:30:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of gaodrew/gaodrew-gorgonzola-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gaodrew/gaodrew-gorgonzola-13b](https://huggingface.co/gaodrew/gaodrew-gorgonzola-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gaodrew__gaodrew-gorgonzola-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T16:30:20.571069](https://huggingface.co/datasets/open-llm-leaderboard/details_gaodrew__gaodrew-gorgonzola-13b/blob/main/results_2023-09-23T16-30-20.571069.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.04121224832214765,\n\
\ \"em_stderr\": 0.0020357012531483946,\n \"f1\": 0.13222525167785196,\n\
\ \"f1_stderr\": 0.002559817666324549,\n \"acc\": 0.4265177812231289,\n\
\ \"acc_stderr\": 0.010193838735770604\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.04121224832214765,\n \"em_stderr\": 0.0020357012531483946,\n\
\ \"f1\": 0.13222525167785196,\n \"f1_stderr\": 0.002559817666324549\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10007581501137225,\n \
\ \"acc_stderr\": 0.008266274528685634\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.012121402942855573\n\
\ }\n}\n```"
repo_url: https://huggingface.co/gaodrew/gaodrew-gorgonzola-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|arc:challenge|25_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_21T04_07_38.110729
path:
- '**/details_harness|drop|3_2023-09-21T04-07-38.110729.parquet'
- split: 2023_09_23T16_30_20.571069
path:
- '**/details_harness|drop|3_2023-09-23T16-30-20.571069.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T16-30-20.571069.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_21T04_07_38.110729
path:
- '**/details_harness|gsm8k|5_2023-09-21T04-07-38.110729.parquet'
- split: 2023_09_23T16_30_20.571069
path:
- '**/details_harness|gsm8k|5_2023-09-23T16-30-20.571069.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T16-30-20.571069.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hellaswag|10_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T21:07:18.787653.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T21_07_18.787653
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T21:07:18.787653.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T21:07:18.787653.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_21T04_07_38.110729
path:
- '**/details_harness|winogrande|5_2023-09-21T04-07-38.110729.parquet'
- split: 2023_09_23T16_30_20.571069
path:
- '**/details_harness|winogrande|5_2023-09-23T16-30-20.571069.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T16-30-20.571069.parquet'
- config_name: results
data_files:
- split: 2023_09_21T04_07_38.110729
path:
- results_2023-09-21T04-07-38.110729.parquet
- split: 2023_09_23T16_30_20.571069
path:
- results_2023-09-23T16-30-20.571069.parquet
- split: latest
path:
- results_2023-09-23T16-30-20.571069.parquet
---
# Dataset Card for Evaluation run of gaodrew/gaodrew-gorgonzola-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/gaodrew/gaodrew-gorgonzola-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [gaodrew/gaodrew-gorgonzola-13b](https://huggingface.co/gaodrew/gaodrew-gorgonzola-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gaodrew__gaodrew-gorgonzola-13b",
"harness_winogrande_5",
split="train")
```
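As a complementary sketch (assuming only the `results` configuration and `latest` split declared in the YAML header above), the aggregated scores can also be loaded directly and browsed with pandas:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split points to
# the most recent evaluation run (see the `results` config in the YAML header).
results = load_dataset(
    "open-llm-leaderboard/details_gaodrew__gaodrew-gorgonzola-13b",
    "results",
    split="latest",
)

# Convert to pandas for a quick look at the stored metrics.
df = results.to_pandas()
print(df.head())
```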
## Latest results
These are the [latest results from run 2023-09-23T16:30:20.571069](https://huggingface.co/datasets/open-llm-leaderboard/details_gaodrew__gaodrew-gorgonzola-13b/blob/main/results_2023-09-23T16-30-20.571069.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.04121224832214765,
"em_stderr": 0.0020357012531483946,
"f1": 0.13222525167785196,
"f1_stderr": 0.002559817666324549,
"acc": 0.4265177812231289,
"acc_stderr": 0.010193838735770604
},
"harness|drop|3": {
"em": 0.04121224832214765,
"em_stderr": 0.0020357012531483946,
"f1": 0.13222525167785196,
"f1_stderr": 0.002559817666324549
},
"harness|gsm8k|5": {
"acc": 0.10007581501137225,
"acc_stderr": 0.008266274528685634
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.012121402942855573
}
}
```
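If you prefer to read this results file directly rather than through `datasets`, a minimal sketch using `huggingface_hub` (assuming the file path from the link above) is:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON from the root of the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_gaodrew__gaodrew-gorgonzola-13b",
    filename="results_2023-09-23T16-30-20.571069.json",
    repo_type="dataset",
)

with open(path) as f:
    raw = json.load(f)

# The aggregated metrics shown above can be looked up in this dictionary;
# the exact nesting can vary between harness versions, so inspect the keys first.
print(raw.keys())
```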
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_gaodrew__gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps | 2023-10-01T22:04:55.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps](https://huggingface.co/gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gaodrew__gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-01T22:04:43.928284](https://huggingface.co/datasets/open-llm-leaderboard/details_gaodrew__gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps/blob/main/results_2023-10-01T22-04-43.928284.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0036703020134228187,\n\
\ \"em_stderr\": 0.0006192871806511078,\n \"f1\": 0.07745071308724842,\n\
\ \"f1_stderr\": 0.0016031429592015417,\n \"acc\": 0.4924286713583812,\n\
\ \"acc_stderr\": 0.01078503608525705\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0036703020134228187,\n \"em_stderr\": 0.0006192871806511078,\n\
\ \"f1\": 0.07745071308724842,\n \"f1_stderr\": 0.0016031429592015417\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17664897649734648,\n \
\ \"acc_stderr\": 0.01050486250585457\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n\
\ }\n}\n```"
repo_url: https://huggingface.co/gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|arc:challenge|25_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_01T22_04_43.928284
path:
- '**/details_harness|drop|3_2023-10-01T22-04-43.928284.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-01T22-04-43.928284.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_01T22_04_43.928284
path:
- '**/details_harness|gsm8k|5_2023-10-01T22-04-43.928284.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-01T22-04-43.928284.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hellaswag|10_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_01T22_04_43.928284
path:
- '**/details_harness|winogrande|5_2023-10-01T22-04-43.928284.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-01T22-04-43.928284.parquet'
- config_name: results
data_files:
- split: 2023_10_01T22_04_43.928284
path:
- results_2023-10-01T22-04-43.928284.parquet
- split: latest
path:
- results_2023-10-01T22-04-43.928284.parquet
---
# Dataset Card for Evaluation run of gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps](https://huggingface.co/gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gaodrew__gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps",
"harness_winogrande_5",
split="train")
```
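If you only need the aggregated metrics rather than the per-example details, the "results" configuration described above can be loaded in the same way. A minimal sketch, using the "latest" split so it always reflects the most recent run:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_gaodrew__gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps",
    "results",
    split="latest",
)
print(results[0])
```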
## Latest results
These are the [latest results from run 2023-10-01T22:04:43.928284](https://huggingface.co/datasets/open-llm-leaderboard/details_gaodrew__gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps/blob/main/results_2023-10-01T22-04-43.928284.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0036703020134228187,
"em_stderr": 0.0006192871806511078,
"f1": 0.07745071308724842,
"f1_stderr": 0.0016031429592015417,
"acc": 0.4924286713583812,
"acc_stderr": 0.01078503608525705
},
"harness|drop|3": {
"em": 0.0036703020134228187,
"em_stderr": 0.0006192871806511078,
"f1": 0.07745071308724842,
"f1_stderr": 0.0016031429592015417
},
"harness|gsm8k|5": {
"acc": 0.17664897649734648,
"acc_stderr": 0.01050486250585457
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
}
}
```
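The same figures are stored in the results JSON linked above. As a sketch (assuming the filename from the link above is still the most recent one, and that the metrics may be nested under additional top-level keys), it can also be fetched directly with `huggingface_hub`:
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file from this dataset repository and inspect it.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_gaodrew__gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps",
    filename="results_2023-10-01T22-04-43.928284.json",
    repo_type="dataset",
)
with open(path) as f:
    raw_results = json.load(f)
print(list(raw_results.keys()))
```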
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_NousResearch__Nous-Puffin-70B | 2023-09-23T17:20:11.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NousResearch/Nous-Puffin-70B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NousResearch/Nous-Puffin-70B](https://huggingface.co/NousResearch/Nous-Puffin-70B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__Nous-Puffin-70B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T17:19:58.299008](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Puffin-70B/blob/main/results_2023-09-23T17-19-58.299008.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n\
\ \"em_stderr\": 0.00045666764626670005,\n \"f1\": 0.06601090604026844,\n\
\ \"f1_stderr\": 0.001371965767363261,\n \"acc\": 0.5908367954724018,\n\
\ \"acc_stderr\": 0.011701371531806812\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626670005,\n\
\ \"f1\": 0.06601090604026844,\n \"f1_stderr\": 0.001371965767363261\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.34268385140257773,\n \
\ \"acc_stderr\": 0.01307303023082791\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8389897395422258,\n \"acc_stderr\": 0.010329712832785715\n\
\ }\n}\n```"
repo_url: https://huggingface.co/NousResearch/Nous-Puffin-70B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|arc:challenge|25_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T17_19_58.299008
path:
- '**/details_harness|drop|3_2023-09-23T17-19-58.299008.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T17-19-58.299008.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T17_19_58.299008
path:
- '**/details_harness|gsm8k|5_2023-09-23T17-19-58.299008.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T17-19-58.299008.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hellaswag|10_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T17:45:27.892102.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T17_45_27.892102
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T17:45:27.892102.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T17:45:27.892102.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T17_19_58.299008
path:
- '**/details_harness|winogrande|5_2023-09-23T17-19-58.299008.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T17-19-58.299008.parquet'
- config_name: results
data_files:
- split: 2023_09_23T17_19_58.299008
path:
- results_2023-09-23T17-19-58.299008.parquet
- split: latest
path:
- results_2023-09-23T17-19-58.299008.parquet
---
# Dataset Card for Evaluation run of NousResearch/Nous-Puffin-70B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NousResearch/Nous-Puffin-70B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NousResearch/Nous-Puffin-70B](https://huggingface.co/NousResearch/Nous-Puffin-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__Nous-Puffin-70B",
"harness_winogrande_5",
split="train")
```
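If you want the aggregated metrics rather than per-task details, the "results" configuration declared above can be loaded the same way; the snippet below is a minimal sketch, assuming the "latest" split name defined in the configs:
```python
from datasets import load_dataset

# Minimal sketch: load the aggregated metrics ("results" configuration, "latest" split)
results = load_dataset("open-llm-leaderboard/details_NousResearch__Nous-Puffin-70B",
                       "results",
                       split="latest")
print(results[0])  # one row containing the aggregated metrics of the most recent run
```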
## Latest results
These are the [latest results from run 2023-09-23T17:19:58.299008](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Puffin-70B/blob/main/results_2023-09-23T17-19-58.299008.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626670005,
"f1": 0.06601090604026844,
"f1_stderr": 0.001371965767363261,
"acc": 0.5908367954724018,
"acc_stderr": 0.011701371531806812
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626670005,
"f1": 0.06601090604026844,
"f1_stderr": 0.001371965767363261
},
"harness|gsm8k|5": {
"acc": 0.34268385140257773,
"acc_stderr": 0.01307303023082791
},
"harness|winogrande|5": {
"acc": 0.8389897395422258,
"acc_stderr": 0.010329712832785715
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v11-bf16 | 2023-08-27T12:43:59.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-llama2-13b-v11-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-llama2-13b-v11-bf16](https://huggingface.co/OpenBuddy/openbuddy-llama2-13b-v11-bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v11-bf16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-24T02:00:08.524632](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v11-bf16/blob/main/results_2023-08-24T02%3A00%3A08.524632.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5139803860489537,\n\
\ \"acc_stderr\": 0.034961863533068085,\n \"acc_norm\": 0.5179928159628271,\n\
\ \"acc_norm_stderr\": 0.034950345577642566,\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.016466769613698296,\n \"mc2\": 0.4794321676886404,\n\
\ \"mc2_stderr\": 0.015234888124968433\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.48890784982935154,\n \"acc_stderr\": 0.014607794914013053,\n\
\ \"acc_norm\": 0.5298634812286689,\n \"acc_norm_stderr\": 0.014585305840007104\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5580561641107349,\n\
\ \"acc_stderr\": 0.004956030970911512,\n \"acc_norm\": 0.7538338976299542,\n\
\ \"acc_norm_stderr\": 0.0042989606748115765\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5320754716981132,\n \"acc_stderr\": 0.03070948699255655,\n\
\ \"acc_norm\": 0.5320754716981132,\n \"acc_norm_stderr\": 0.03070948699255655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.3988439306358382,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.032321469162244675,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.032321469162244675\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5774193548387097,\n\
\ \"acc_stderr\": 0.02810096472427264,\n \"acc_norm\": 0.5774193548387097,\n\
\ \"acc_norm_stderr\": 0.02810096472427264\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39408866995073893,\n \"acc_stderr\": 0.034381579670365446,\n\
\ \"acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.034381579670365446\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391244,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391244\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.031195840877700293,\n\
\ \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.031195840877700293\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.441025641025641,\n \"acc_stderr\": 0.025174048384000756,\n \
\ \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.025174048384000756\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6825688073394496,\n \"acc_stderr\": 0.019957152198460493,\n \"\
acc_norm\": 0.6825688073394496,\n \"acc_norm_stderr\": 0.019957152198460493\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608043,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6813725490196079,\n \"acc_stderr\": 0.032702871814820796,\n \"\
acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.032702871814820796\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6708860759493671,\n \"acc_stderr\": 0.030587326294702368,\n \
\ \"acc_norm\": 0.6708860759493671,\n \"acc_norm_stderr\": 0.030587326294702368\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.04374928560599738,\n\
\ \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.04374928560599738\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.03814269893261837,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.03814269893261837\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764377,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764377\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n\
\ \"acc_stderr\": 0.02704685763071668,\n \"acc_norm\": 0.782051282051282,\n\
\ \"acc_norm_stderr\": 0.02704685763071668\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7011494252873564,\n\
\ \"acc_stderr\": 0.016369256815093144,\n \"acc_norm\": 0.7011494252873564,\n\
\ \"acc_norm_stderr\": 0.016369256815093144\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5346820809248555,\n \"acc_stderr\": 0.026854257928258875,\n\
\ \"acc_norm\": 0.5346820809248555,\n \"acc_norm_stderr\": 0.026854257928258875\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767867,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767867\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.02843109544417664,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.02843109544417664\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n\
\ \"acc_stderr\": 0.028043399858210628,\n \"acc_norm\": 0.5787781350482315,\n\
\ \"acc_norm_stderr\": 0.028043399858210628\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5154320987654321,\n \"acc_stderr\": 0.0278074900442762,\n\
\ \"acc_norm\": 0.5154320987654321,\n \"acc_norm_stderr\": 0.0278074900442762\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.02883892147125145,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.02883892147125145\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39765319426336376,\n\
\ \"acc_stderr\": 0.012499840347460643,\n \"acc_norm\": 0.39765319426336376,\n\
\ \"acc_norm_stderr\": 0.012499840347460643\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.03034326422421352,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.03034326422421352\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4869281045751634,\n \"acc_stderr\": 0.020220920829626916,\n \
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.020220920829626916\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235936,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235936\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.016466769613698296,\n \"mc2\": 0.4794321676886404,\n\
\ \"mc2_stderr\": 0.015234888124968433\n }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-llama2-13b-v11-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|arc:challenge|25_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hellaswag|10_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T02:00:08.524632.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T02:00:08.524632.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T02_00_08.524632
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T02:00:08.524632.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T02:00:08.524632.parquet'
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-llama2-13b-v11-bf16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-llama2-13b-v11-bf16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-llama2-13b-v11-bf16](https://huggingface.co/OpenBuddy/openbuddy-llama2-13b-v11-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v11-bf16",
"harness_truthfulqa_mc_0",
split="train")
```
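Because each evaluated task is exposed as its own configuration, you can also enumerate the available configurations before loading one; a minimal sketch using the `datasets` helper `get_dataset_config_names`:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v11-bf16"

# List every configuration (one per evaluated task)
configs = get_dataset_config_names(repo)
print(len(configs), "configurations available")

# Load one of them through its "latest" split, e.g. the 5-shot world religions subset
data = load_dataset(repo, "harness_hendrycksTest_world_religions_5", split="latest")
```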
## Latest results
These are the [latest results from run 2023-08-24T02:00:08.524632](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v11-bf16/blob/main/results_2023-08-24T02%3A00%3A08.524632.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5139803860489537,
"acc_stderr": 0.034961863533068085,
"acc_norm": 0.5179928159628271,
"acc_norm_stderr": 0.034950345577642566,
"mc1": 0.33047735618115054,
"mc1_stderr": 0.016466769613698296,
"mc2": 0.4794321676886404,
"mc2_stderr": 0.015234888124968433
},
"harness|arc:challenge|25": {
"acc": 0.48890784982935154,
"acc_stderr": 0.014607794914013053,
"acc_norm": 0.5298634812286689,
"acc_norm_stderr": 0.014585305840007104
},
"harness|hellaswag|10": {
"acc": 0.5580561641107349,
"acc_stderr": 0.004956030970911512,
"acc_norm": 0.7538338976299542,
"acc_norm_stderr": 0.0042989606748115765
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5320754716981132,
"acc_stderr": 0.03070948699255655,
"acc_norm": 0.5320754716981132,
"acc_norm_stderr": 0.03070948699255655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.032321469162244675,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.032321469162244675
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5774193548387097,
"acc_stderr": 0.02810096472427264,
"acc_norm": 0.5774193548387097,
"acc_norm_stderr": 0.02810096472427264
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.034381579670365446,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.034381579670365446
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391244,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391244
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7512953367875648,
"acc_stderr": 0.031195840877700293,
"acc_norm": 0.7512953367875648,
"acc_norm_stderr": 0.031195840877700293
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.025174048384000756,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.025174048384000756
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6825688073394496,
"acc_stderr": 0.019957152198460493,
"acc_norm": 0.6825688073394496,
"acc_norm_stderr": 0.019957152198460493
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.032702871814820796,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.032702871814820796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6708860759493671,
"acc_stderr": 0.030587326294702368,
"acc_norm": 0.6708860759493671,
"acc_norm_stderr": 0.030587326294702368
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.03814269893261837,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.03814269893261837
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764377,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764377
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.02704685763071668,
"acc_norm": 0.782051282051282,
"acc_norm_stderr": 0.02704685763071668
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7011494252873564,
"acc_stderr": 0.016369256815093144,
"acc_norm": 0.7011494252873564,
"acc_norm_stderr": 0.016369256815093144
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5346820809248555,
"acc_stderr": 0.026854257928258875,
"acc_norm": 0.5346820809248555,
"acc_norm_stderr": 0.026854257928258875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767867,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767867
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.02843109544417664,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.02843109544417664
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5787781350482315,
"acc_stderr": 0.028043399858210628,
"acc_norm": 0.5787781350482315,
"acc_norm_stderr": 0.028043399858210628
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5154320987654321,
"acc_stderr": 0.0278074900442762,
"acc_norm": 0.5154320987654321,
"acc_norm_stderr": 0.0278074900442762
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.02883892147125145,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.02883892147125145
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39765319426336376,
"acc_stderr": 0.012499840347460643,
"acc_norm": 0.39765319426336376,
"acc_norm_stderr": 0.012499840347460643
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.03034326422421352,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.03034326422421352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.020220920829626916,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.020220920829626916
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235936,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33047735618115054,
"mc1_stderr": 0.016466769613698296,
"mc2": 0.4794321676886404,
"mc2_stderr": 0.015234888124968433
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-70b-v10.1-bf16 | 2023-08-27T12:44:01.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-llama2-70b-v10.1-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-llama2-70b-v10.1-bf16](https://huggingface.co/OpenBuddy/openbuddy-llama2-70b-v10.1-bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-70b-v10.1-bf16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-24T02:13:46.174691](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-70b-v10.1-bf16/blob/main/results_2023-08-24T02%3A13%3A46.174691.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6718040464702424,\n\
\ \"acc_stderr\": 0.03189106536237187,\n \"acc_norm\": 0.675785404249262,\n\
\ \"acc_norm_stderr\": 0.03186919440814038,\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347976,\n \"mc2\": 0.5618433959764939,\n\
\ \"mc2_stderr\": 0.015069750245874362\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.01440982551840308,\n\
\ \"acc_norm\": 0.6186006825938567,\n \"acc_norm_stderr\": 0.014194389086685247\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.632244572794264,\n\
\ \"acc_stderr\": 0.004812088620277191,\n \"acc_norm\": 0.8313085042820155,\n\
\ \"acc_norm_stderr\": 0.0037371387523369415\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.034597776068105365,\n\
\ \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.034597776068105365\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.02537952491077839,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.02537952491077839\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723292,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723292\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.030117688929503585,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.030117688929503585\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603918,\n \"\
acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603918\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.023400928918310495,\n\
\ \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.023400928918310495\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3851851851851852,\n \"acc_stderr\": 0.02967090612463089,\n \
\ \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.02967090612463089\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.027722065493361266,\n\
\ \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.027722065493361266\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849927,\n \"\
acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849927\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8733944954128441,\n \"acc_stderr\": 0.014257128686165169,\n \"\
acc_norm\": 0.8733944954128441,\n \"acc_norm_stderr\": 0.014257128686165169\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\"\
: 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8774509803921569,\n\
\ \"acc_stderr\": 0.023015389732458258,\n \"acc_norm\": 0.8774509803921569,\n\
\ \"acc_norm_stderr\": 0.023015389732458258\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8438818565400844,\n \"acc_stderr\": 0.023627159460318667,\n\
\ \"acc_norm\": 0.8438818565400844,\n \"acc_norm_stderr\": 0.023627159460318667\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7533632286995515,\n\
\ \"acc_stderr\": 0.028930413120910877,\n \"acc_norm\": 0.7533632286995515,\n\
\ \"acc_norm_stderr\": 0.028930413120910877\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.03008309871603521,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.03008309871603521\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.030833491146281245,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.030833491146281245\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6071428571428571,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.6071428571428571,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n\
\ \"acc_stderr\": 0.016534627684311364,\n \"acc_norm\": 0.9316239316239316,\n\
\ \"acc_norm_stderr\": 0.016534627684311364\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8378033205619413,\n\
\ \"acc_stderr\": 0.013182222616720887,\n \"acc_norm\": 0.8378033205619413,\n\
\ \"acc_norm_stderr\": 0.013182222616720887\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n\
\ \"acc_stderr\": 0.016598022120580425,\n \"acc_norm\": 0.43910614525139663,\n\
\ \"acc_norm_stderr\": 0.016598022120580425\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729487,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729487\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.752411575562701,\n\
\ \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.752411575562701,\n\
\ \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.530638852672751,\n \"acc_stderr\": 0.012746237711716634,\n\
\ \"acc_norm\": 0.530638852672751,\n \"acc_norm_stderr\": 0.012746237711716634\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n \"\
acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7124183006535948,\n \"acc_stderr\": 0.018311653053648222,\n \
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.018311653053648222\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7795918367346939,\n \"acc_stderr\": 0.02653704531214529,\n\
\ \"acc_norm\": 0.7795918367346939,\n \"acc_norm_stderr\": 0.02653704531214529\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347976,\n \"mc2\": 0.5618433959764939,\n\
\ \"mc2_stderr\": 0.015069750245874362\n }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-llama2-70b-v10.1-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|arc:challenge|25_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hellaswag|10_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T02:13:46.174691.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T02:13:46.174691.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T02_13_46.174691
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T02:13:46.174691.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T02:13:46.174691.parquet'
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-llama2-70b-v10.1-bf16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-llama2-70b-v10.1-bf16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-llama2-70b-v10.1-bf16](https://huggingface.co/OpenBuddy/openbuddy-llama2-70b-v10.1-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-70b-v10.1-bf16",
"harness_truthfulqa_mc_0",
split="train")
```
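You can also point `load_dataset` at any of the per-task configurations listed above and request the `latest` split directly. The following is a minimal sketch; the config name is just one example taken from this card's configuration list and any other listed config name can be substituted:
```python
from datasets import load_dataset

# Minimal sketch: load the most recent details for a single MMLU sub-task of this run.
# "harness_hendrycksTest_abstract_algebra_5" is one of the configurations listed above.
data = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-70b-v10.1-bf16",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(data)
```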
## Latest results
These are the [latest results from run 2023-08-24T02:13:46.174691](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-70b-v10.1-bf16/blob/main/results_2023-08-24T02%3A13%3A46.174691.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6718040464702424,
"acc_stderr": 0.03189106536237187,
"acc_norm": 0.675785404249262,
"acc_norm_stderr": 0.03186919440814038,
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347976,
"mc2": 0.5618433959764939,
"mc2_stderr": 0.015069750245874362
},
"harness|arc:challenge|25": {
"acc": 0.5827645051194539,
"acc_stderr": 0.01440982551840308,
"acc_norm": 0.6186006825938567,
"acc_norm_stderr": 0.014194389086685247
},
"harness|hellaswag|10": {
"acc": 0.632244572794264,
"acc_stderr": 0.004812088620277191,
"acc_norm": 0.8313085042820155,
"acc_norm_stderr": 0.0037371387523369415
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.034597776068105365,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.034597776068105365
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.02537952491077839,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.02537952491077839
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.030117688929503585,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.030117688929503585
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603918,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603918
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.023400928918310495,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.023400928918310495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.02967090612463089,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.02967090612463089
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7605042016806722,
"acc_stderr": 0.027722065493361266,
"acc_norm": 0.7605042016806722,
"acc_norm_stderr": 0.027722065493361266
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849927,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849927
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8733944954128441,
"acc_stderr": 0.014257128686165169,
"acc_norm": 0.8733944954128441,
"acc_norm_stderr": 0.014257128686165169
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8774509803921569,
"acc_stderr": 0.023015389732458258,
"acc_norm": 0.8774509803921569,
"acc_norm_stderr": 0.023015389732458258
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8438818565400844,
"acc_stderr": 0.023627159460318667,
"acc_norm": 0.8438818565400844,
"acc_norm_stderr": 0.023627159460318667
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7533632286995515,
"acc_stderr": 0.028930413120910877,
"acc_norm": 0.7533632286995515,
"acc_norm_stderr": 0.028930413120910877
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.03008309871603521,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.03008309871603521
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.030833491146281245,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.030833491146281245
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6071428571428571,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.6071428571428571,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.016534627684311364,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.016534627684311364
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8378033205619413,
"acc_stderr": 0.013182222616720887,
"acc_norm": 0.8378033205619413,
"acc_norm_stderr": 0.013182222616720887
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43910614525139663,
"acc_stderr": 0.016598022120580425,
"acc_norm": 0.43910614525139663,
"acc_norm_stderr": 0.016598022120580425
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729487,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729487
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.752411575562701,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.752411575562701,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.530638852672751,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.530638852672751,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.018311653053648222,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.018311653053648222
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7795918367346939,
"acc_stderr": 0.02653704531214529,
"acc_norm": 0.7795918367346939,
"acc_norm_stderr": 0.02653704531214529
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347976,
"mc2": 0.5618433959764939,
"mc2_stderr": 0.015069750245874362
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_bigcode__santacoder | 2023-08-27T12:44:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of bigcode/santacoder
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bigcode/santacoder](https://huggingface.co/bigcode/santacoder) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigcode__santacoder\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-23T16:23:33.954864](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__santacoder/blob/main/results_2023-08-23T16%3A23%3A33.954864.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each of them in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25850286195198124,\n\
\ \"acc_stderr\": 0.03179685758787311,\n \"acc_norm\": 0.25894177745866664,\n\
\ \"acc_norm_stderr\": 0.0318042099829945,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766367,\n \"mc2\": 0.5124422674921806,\n\
\ \"mc2_stderr\": 0.016978714870018875\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23720136518771331,\n \"acc_stderr\": 0.01243039982926084,\n\
\ \"acc_norm\": 0.2627986348122867,\n \"acc_norm_stderr\": 0.012862523175351333\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2557259510057757,\n\
\ \"acc_stderr\": 0.0043537687306445605,\n \"acc_norm\": 0.2560246962756423,\n\
\ \"acc_norm_stderr\": 0.004355436696716298\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.03738520676119667,\n\
\ \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.03738520676119667\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.02201908001221789,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.02201908001221789\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.267741935483871,\n\
\ \"acc_stderr\": 0.02518900666021238,\n \"acc_norm\": 0.267741935483871,\n\
\ \"acc_norm_stderr\": 0.02518900666021238\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3251231527093596,\n \"acc_stderr\": 0.03295797566311271,\n\
\ \"acc_norm\": 0.3251231527093596,\n \"acc_norm_stderr\": 0.03295797566311271\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945637,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945637\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24870466321243523,\n \"acc_stderr\": 0.031195840877700307,\n\
\ \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.031195840877700307\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.02102067268082791,\n \
\ \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.02102067268082791\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.034791855725996614,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.034791855725996614\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24403669724770644,\n \"acc_stderr\": 0.018415286351416416,\n \"\
acc_norm\": 0.24403669724770644,\n \"acc_norm_stderr\": 0.018415286351416416\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.02988691054762697,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02988691054762697\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27941176470588236,\n \"acc_stderr\": 0.031493281045079556,\n \"\
acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.031493281045079556\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.20675105485232068,\n \"acc_stderr\": 0.026361651668389104,\n \
\ \"acc_norm\": 0.20675105485232068,\n \"acc_norm_stderr\": 0.026361651668389104\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.37668161434977576,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.37668161434977576,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591204,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591204\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2886334610472541,\n\
\ \"acc_stderr\": 0.016203792703197804,\n \"acc_norm\": 0.2886334610472541,\n\
\ \"acc_norm_stderr\": 0.016203792703197804\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2745664739884393,\n \"acc_stderr\": 0.02402774515526501,\n\
\ \"acc_norm\": 0.2745664739884393,\n \"acc_norm_stderr\": 0.02402774515526501\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n\
\ \"acc_stderr\": 0.014736926383761983,\n \"acc_norm\": 0.2636871508379888,\n\
\ \"acc_norm_stderr\": 0.014736926383761983\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.02465968518596728,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.02465968518596728\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290403,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290403\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n\
\ \"acc_stderr\": 0.010986307870045531,\n \"acc_norm\": 0.24511082138200782,\n\
\ \"acc_norm_stderr\": 0.010986307870045531\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2696078431372549,\n \"acc_stderr\": 0.017952449196987866,\n \
\ \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.017952449196987866\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3510204081632653,\n \"acc_stderr\": 0.03055531675557364,\n\
\ \"acc_norm\": 0.3510204081632653,\n \"acc_norm_stderr\": 0.03055531675557364\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401466,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401466\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766367,\n \"mc2\": 0.5124422674921806,\n\
\ \"mc2_stderr\": 0.016978714870018875\n }\n}\n```"
repo_url: https://huggingface.co/bigcode/santacoder
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|arc:challenge|25_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hellaswag|10_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T16:23:33.954864.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T16:23:33.954864.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T16_23_33.954864
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T16:23:33.954864.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T16:23:33.954864.parquet'
---
# Dataset Card for Evaluation run of bigcode/santacoder
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bigcode/santacoder
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bigcode/santacoder](https://huggingface.co/bigcode/santacoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigcode__santacoder",
"harness_truthfulqa_mc_0",
split="train")
```
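As a minimal sketch, you can also enumerate the available configurations and load the "latest" split of a specific task; the configuration name below is taken from the file listing above, and `get_dataset_config_names` comes from the same `datasets` library (the snippet is illustrative rather than canonical):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_bigcode__santacoder"

# List every available configuration (one per evaluated task).
print(get_dataset_config_names(repo))

# The "latest" split of a per-task configuration points to the most recent run
# (this config name is one example from the listing above).
world_religions = load_dataset(repo, "harness_hendrycksTest_world_religions_5", split="latest")
print(world_religions)
```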
## Latest results
These are the [latest results from run 2023-08-23T16:23:33.954864](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__santacoder/blob/main/results_2023-08-23T16%3A23%3A33.954864.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25850286195198124,
"acc_stderr": 0.03179685758787311,
"acc_norm": 0.25894177745866664,
"acc_norm_stderr": 0.0318042099829945,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766367,
"mc2": 0.5124422674921806,
"mc2_stderr": 0.016978714870018875
},
"harness|arc:challenge|25": {
"acc": 0.23720136518771331,
"acc_stderr": 0.01243039982926084,
"acc_norm": 0.2627986348122867,
"acc_norm_stderr": 0.012862523175351333
},
"harness|hellaswag|10": {
"acc": 0.2557259510057757,
"acc_stderr": 0.0043537687306445605,
"acc_norm": 0.2560246962756423,
"acc_norm_stderr": 0.004355436696716298
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740206,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740206
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.03738520676119667,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.03738520676119667
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.02201908001221789,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.02201908001221789
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.267741935483871,
"acc_stderr": 0.02518900666021238,
"acc_norm": 0.267741935483871,
"acc_norm_stderr": 0.02518900666021238
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3251231527093596,
"acc_stderr": 0.03295797566311271,
"acc_norm": 0.3251231527093596,
"acc_norm_stderr": 0.03295797566311271
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945637,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945637
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24870466321243523,
"acc_stderr": 0.031195840877700307,
"acc_norm": 0.24870466321243523,
"acc_norm_stderr": 0.031195840877700307
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.02102067268082791,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.02102067268082791
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.034791855725996614,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.034791855725996614
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24403669724770644,
"acc_stderr": 0.018415286351416416,
"acc_norm": 0.24403669724770644,
"acc_norm_stderr": 0.018415286351416416
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02988691054762697,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02988691054762697
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20675105485232068,
"acc_stderr": 0.026361651668389104,
"acc_norm": 0.20675105485232068,
"acc_norm_stderr": 0.026361651668389104
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.37668161434977576,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.37668161434977576,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591204,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591204
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004253,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004253
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2886334610472541,
"acc_stderr": 0.016203792703197804,
"acc_norm": 0.2886334610472541,
"acc_norm_stderr": 0.016203792703197804
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2745664739884393,
"acc_stderr": 0.02402774515526501,
"acc_norm": 0.2745664739884393,
"acc_norm_stderr": 0.02402774515526501
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761983,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761983
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290403,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290403
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045531,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045531
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.017952449196987866,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.017952449196987866
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3510204081632653,
"acc_stderr": 0.03055531675557364,
"acc_norm": 0.3510204081632653,
"acc_norm_stderr": 0.03055531675557364
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401466,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401466
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766367,
"mc2": 0.5124422674921806,
"mc2_stderr": 0.016978714870018875
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_acrastt__Marx-3B-V2 | 2023-08-27T12:44:05.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of acrastt/Marx-3B-V2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [acrastt/Marx-3B-V2](https://huggingface.co/acrastt/Marx-3B-V2) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_acrastt__Marx-3B-V2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T23:34:31.672257](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__Marx-3B-V2/blob/main/results_2023-08-22T23%3A34%3A31.672257.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and in the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2849908485197027,\n\
\ \"acc_stderr\": 0.03259762868299022,\n \"acc_norm\": 0.2887484166496788,\n\
\ \"acc_norm_stderr\": 0.03259191472696891,\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.39919834030421253,\n\
\ \"mc2_stderr\": 0.014149154740446326\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3993174061433447,\n \"acc_stderr\": 0.0143120945579467,\n\
\ \"acc_norm\": 0.4402730375426621,\n \"acc_norm_stderr\": 0.014506769524804241\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.548496315475005,\n\
\ \"acc_stderr\": 0.004966255089212425,\n \"acc_norm\": 0.729237203744274,\n\
\ \"acc_norm_stderr\": 0.004434456717097584\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.35526315789473684,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.35526315789473684,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438655,\n\
\ \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036843,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036843\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.029644006577009618,\n\
\ \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.029644006577009618\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281334,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.03395490020856113,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.03395490020856113\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.267741935483871,\n\
\ \"acc_stderr\": 0.025189006660212385,\n \"acc_norm\": 0.267741935483871,\n\
\ \"acc_norm_stderr\": 0.025189006660212385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358611,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358611\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.32727272727272727,\n \"acc_stderr\": 0.03663974994391242,\n\
\ \"acc_norm\": 0.32727272727272727,\n \"acc_norm_stderr\": 0.03663974994391242\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.03191178226713547,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.03191178226713547\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2717948717948718,\n \"acc_stderr\": 0.02255655101013233,\n \
\ \"acc_norm\": 0.2717948717948718,\n \"acc_norm_stderr\": 0.02255655101013233\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.027722065493361276,\n\
\ \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.027722065493361276\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119996,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119996\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.27339449541284405,\n \"acc_stderr\": 0.019109299846098285,\n \"\
acc_norm\": 0.27339449541284405,\n \"acc_norm_stderr\": 0.019109299846098285\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.03256850570293648,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.03256850570293648\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2869198312236287,\n \"acc_stderr\": 0.029443773022594693,\n \
\ \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.029443773022594693\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.183206106870229,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.183206106870229,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.36363636363636365,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2912621359223301,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.2912621359223301,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.028911208802749458,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.028911208802749458\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3090676883780332,\n\
\ \"acc_stderr\": 0.016524988919702183,\n \"acc_norm\": 0.3090676883780332,\n\
\ \"acc_norm_stderr\": 0.016524988919702183\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2745664739884393,\n \"acc_stderr\": 0.024027745155265016,\n\
\ \"acc_norm\": 0.2745664739884393,\n \"acc_norm_stderr\": 0.024027745155265016\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n\
\ \"acc_stderr\": 0.014756906483260659,\n \"acc_norm\": 0.264804469273743,\n\
\ \"acc_norm_stderr\": 0.014756906483260659\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.025646863097137904,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.025646863097137904\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3054662379421222,\n\
\ \"acc_stderr\": 0.026160584450140478,\n \"acc_norm\": 0.3054662379421222,\n\
\ \"acc_norm_stderr\": 0.026160584450140478\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.30864197530864196,\n \"acc_stderr\": 0.025702640260603753,\n\
\ \"acc_norm\": 0.30864197530864196,\n \"acc_norm_stderr\": 0.025702640260603753\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26010430247718386,\n\
\ \"acc_stderr\": 0.011204382887823843,\n \"acc_norm\": 0.26010430247718386,\n\
\ \"acc_norm_stderr\": 0.011204382887823843\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.02352924218519311,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.02352924218519311\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612379,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612379\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3306122448979592,\n \"acc_stderr\": 0.030116426296540606,\n\
\ \"acc_norm\": 0.3306122448979592,\n \"acc_norm_stderr\": 0.030116426296540606\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.031157150869355558,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.031157150869355558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.03647168523683227,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.03647168523683227\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.035650796707083106,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.035650796707083106\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.39919834030421253,\n\
\ \"mc2_stderr\": 0.014149154740446326\n }\n}\n```"
repo_url: https://huggingface.co/acrastt/Marx-3B-V2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|arc:challenge|25_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hellaswag|10_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T23:34:31.672257.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T23:34:31.672257.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T23_34_31.672257
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T23:34:31.672257.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T23:34:31.672257.parquet'
---
# Dataset Card for Evaluation run of acrastt/Marx-3B-V2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/acrastt/Marx-3B-V2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [acrastt/Marx-3B-V2](https://huggingface.co/acrastt/Marx-3B-V2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_acrastt__Marx-3B-V2",
"harness_truthfulqa_mc_0",
split="train")
```
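The same pattern applies to the aggregated metrics mentioned above. A minimal sketch, assuming the aggregated results are exposed under the configuration name "results" (as stated in the summary) with the "train" split pointing at the latest run:
```python
from datasets import load_dataset

# Load the aggregated metrics of the run; the config name "results" and the
# "train" split pointing to the latest results follow the description above.
results = load_dataset(
    "open-llm-leaderboard/details_acrastt__Marx-3B-V2",
    "results",
    split="train",
)

# Each row holds the aggregated metrics for one evaluation run.
print(results[0])
```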
## Latest results
These are the [latest results from run 2023-08-22T23:34:31.672257](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__Marx-3B-V2/blob/main/results_2023-08-22T23%3A34%3A31.672257.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2849908485197027,
"acc_stderr": 0.03259762868299022,
"acc_norm": 0.2887484166496788,
"acc_norm_stderr": 0.03259191472696891,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.39919834030421253,
"mc2_stderr": 0.014149154740446326
},
"harness|arc:challenge|25": {
"acc": 0.3993174061433447,
"acc_stderr": 0.0143120945579467,
"acc_norm": 0.4402730375426621,
"acc_norm_stderr": 0.014506769524804241
},
"harness|hellaswag|10": {
"acc": 0.548496315475005,
"acc_stderr": 0.004966255089212425,
"acc_norm": 0.729237203744274,
"acc_norm_stderr": 0.004434456717097584
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.35526315789473684,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.35526315789473684,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.028254200344438655,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.028254200344438655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.029644006577009618,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.029644006577009618
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281334,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856113,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856113
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.267741935483871,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.267741935483871,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358611,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358611
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.03663974994391242,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.03663974994391242
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03191178226713547,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03191178226713547
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2717948717948718,
"acc_stderr": 0.02255655101013233,
"acc_norm": 0.2717948717948718,
"acc_norm_stderr": 0.02255655101013233
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.027722065493361276,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.027722065493361276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119996,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119996
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27339449541284405,
"acc_stderr": 0.019109299846098285,
"acc_norm": 0.27339449541284405,
"acc_norm_stderr": 0.019109299846098285
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.03256850570293648,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.03256850570293648
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2869198312236287,
"acc_stderr": 0.029443773022594693,
"acc_norm": 0.2869198312236287,
"acc_norm_stderr": 0.029443773022594693
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34977578475336324,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.34977578475336324,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.183206106870229,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.183206106870229,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.2912621359223301,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.2912621359223301,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749458,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749458
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3090676883780332,
"acc_stderr": 0.016524988919702183,
"acc_norm": 0.3090676883780332,
"acc_norm_stderr": 0.016524988919702183
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2745664739884393,
"acc_stderr": 0.024027745155265016,
"acc_norm": 0.2745664739884393,
"acc_norm_stderr": 0.024027745155265016
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260659,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260659
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.025646863097137904,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.025646863097137904
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3054662379421222,
"acc_stderr": 0.026160584450140478,
"acc_norm": 0.3054662379421222,
"acc_norm_stderr": 0.026160584450140478
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.30864197530864196,
"acc_stderr": 0.025702640260603753,
"acc_norm": 0.30864197530864196,
"acc_norm_stderr": 0.025702640260603753
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590638,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26010430247718386,
"acc_stderr": 0.011204382887823843,
"acc_norm": 0.26010430247718386,
"acc_norm_stderr": 0.011204382887823843
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.02352924218519311,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.02352924218519311
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.017667841612379,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.017667841612379
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3306122448979592,
"acc_stderr": 0.030116426296540606,
"acc_norm": 0.3306122448979592,
"acc_norm_stderr": 0.030116426296540606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.031157150869355558,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.031157150869355558
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683227,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683227
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.39919834030421253,
"mc2_stderr": 0.014149154740446326
}
}
```
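These aggregated figures can also be loaded programmatically rather than read off this card. A minimal sketch, assuming this repository follows the usual leaderboard layout (a `results` configuration whose `latest` split holds the most recent run); the repository id below is a placeholder for this dataset's actual `details_<org>__<model>` name:
```python
from datasets import load_dataset

# Placeholder repository id: substitute this dataset's actual
# "open-llm-leaderboard/details_<org>__<model>" name.
repo_id = "open-llm-leaderboard/details_<org>__<model>"

# The "results" configuration stores the aggregated metrics shown above;
# the "latest" split always points to the most recent evaluation run.
results = load_dataset(repo_id, "results", split="latest")
print(results[0])
```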
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
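While the canonical description is pending, these evaluation-detail repositories follow the Open LLM Leaderboard convention of one split per run timestamp plus a `latest` alias pointing to the most recent run. A hedged sketch for enumerating them (the repository id is again a placeholder, not this dataset's actual name):
```python
from datasets import get_dataset_config_names, get_dataset_split_names

# Placeholder repository id following the leaderboard's naming scheme.
repo_id = "open-llm-leaderboard/details_<org>__<model>"

# Every evaluated task is a separate configuration...
for config in get_dataset_config_names(repo_id)[:3]:
    # ...and every run appears as a timestamped split, plus "latest".
    print(config, get_dataset_split_names(repo_id, config))
```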
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_mosaicml__mpt-7b-8k-instruct | 2023-10-03T22:47:37.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of mosaicml/mpt-7b-8k-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mosaicml/mpt-7b-8k-instruct](https://huggingface.co/mosaicml/mpt-7b-8k-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mosaicml__mpt-7b-8k-instruct\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T22:46:19.065505](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-8k-instruct/blob/main/results_2023-10-03T22-46-19.065505.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4235929462139018,\n\
\ \"acc_stderr\": 0.03519064012788231,\n \"acc_norm\": 0.42686334313202534,\n\
\ \"acc_norm_stderr\": 0.03518177126224397,\n \"mc1\": 0.21542227662178703,\n\
\ \"mc1_stderr\": 0.01439190265242768,\n \"mc2\": 0.3542683630494196,\n\
\ \"mc2_stderr\": 0.01528351213467969\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4325938566552901,\n \"acc_stderr\": 0.014478005694182528,\n\
\ \"acc_norm\": 0.45307167235494883,\n \"acc_norm_stderr\": 0.01454689205200563\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5736904999004182,\n\
\ \"acc_stderr\": 0.004935291975579197,\n \"acc_norm\": 0.7461661023700458,\n\
\ \"acc_norm_stderr\": 0.004343142545094253\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4226415094339623,\n \"acc_stderr\": 0.030402331445769537,\n\
\ \"acc_norm\": 0.4226415094339623,\n \"acc_norm_stderr\": 0.030402331445769537\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4305555555555556,\n\
\ \"acc_stderr\": 0.04140685639111502,\n \"acc_norm\": 0.4305555555555556,\n\
\ \"acc_norm_stderr\": 0.04140685639111502\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.34104046242774566,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.34104046242774566,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101736,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101736\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30687830687830686,\n \"acc_stderr\": 0.02375292871211214,\n \"\
acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.02375292871211214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790605,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.45483870967741935,\n\
\ \"acc_stderr\": 0.028327743091561067,\n \"acc_norm\": 0.45483870967741935,\n\
\ \"acc_norm_stderr\": 0.028327743091561067\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617732,\n\
\ \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617732\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.038154943086889305,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.038154943086889305\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4494949494949495,\n \"acc_stderr\": 0.0354413249194797,\n \"acc_norm\"\
: 0.4494949494949495,\n \"acc_norm_stderr\": 0.0354413249194797\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.5906735751295337,\n \"acc_stderr\": 0.03548608168860806,\n\
\ \"acc_norm\": 0.5906735751295337,\n \"acc_norm_stderr\": 0.03548608168860806\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.024035489676335065,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.024035489676335065\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230193,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3739495798319328,\n \"acc_stderr\": 0.031429466378837076,\n\
\ \"acc_norm\": 0.3739495798319328,\n \"acc_norm_stderr\": 0.031429466378837076\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5724770642201835,\n\
\ \"acc_stderr\": 0.021210910204300434,\n \"acc_norm\": 0.5724770642201835,\n\
\ \"acc_norm_stderr\": 0.021210910204300434\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.03085199299325701,\n\
\ \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.03085199299325701\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5931372549019608,\n \"acc_stderr\": 0.03447891136353382,\n \"\
acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.03447891136353382\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n \
\ \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4618834080717489,\n\
\ \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.4618834080717489,\n\
\ \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5038167938931297,\n \"acc_stderr\": 0.043851623256015534,\n\
\ \"acc_norm\": 0.5038167938931297,\n \"acc_norm_stderr\": 0.043851623256015534\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.48760330578512395,\n \"acc_stderr\": 0.04562951548180765,\n \"\
acc_norm\": 0.48760330578512395,\n \"acc_norm_stderr\": 0.04562951548180765\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3803680981595092,\n \"acc_stderr\": 0.03814269893261836,\n\
\ \"acc_norm\": 0.3803680981595092,\n \"acc_norm_stderr\": 0.03814269893261836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.049505043821289195,\n\
\ \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.049505043821289195\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5726495726495726,\n\
\ \"acc_stderr\": 0.032408473935163266,\n \"acc_norm\": 0.5726495726495726,\n\
\ \"acc_norm_stderr\": 0.032408473935163266\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.5670498084291188,\n \"acc_stderr\": 0.017718469101513982,\n\
\ \"acc_norm\": 0.5670498084291188,\n \"acc_norm_stderr\": 0.017718469101513982\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4046242774566474,\n\
\ \"acc_stderr\": 0.026424816594009852,\n \"acc_norm\": 0.4046242774566474,\n\
\ \"acc_norm_stderr\": 0.026424816594009852\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.2860335195530726,\n \"acc_stderr\": 0.015113972129062127,\n\
\ \"acc_norm\": 0.2860335195530726,\n \"acc_norm_stderr\": 0.015113972129062127\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.45098039215686275,\n\
\ \"acc_stderr\": 0.02849199358617156,\n \"acc_norm\": 0.45098039215686275,\n\
\ \"acc_norm_stderr\": 0.02849199358617156\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.4855305466237942,\n \"acc_stderr\": 0.02838619808417768,\n\
\ \"acc_norm\": 0.4855305466237942,\n \"acc_norm_stderr\": 0.02838619808417768\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.43209876543209874,\n\
\ \"acc_stderr\": 0.027563010971606676,\n \"acc_norm\": 0.43209876543209874,\n\
\ \"acc_norm_stderr\": 0.027563010971606676\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.28368794326241137,\n \"acc_stderr\": 0.026891709428343957,\n\
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.026891709428343957\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3324641460234681,\n\
\ \"acc_stderr\": 0.012032022332260518,\n \"acc_norm\": 0.3324641460234681,\n\
\ \"acc_norm_stderr\": 0.012032022332260518\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3492647058823529,\n \"acc_stderr\": 0.028959755196824866,\n\
\ \"acc_norm\": 0.3492647058823529,\n \"acc_norm_stderr\": 0.028959755196824866\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4411764705882353,\n \"acc_stderr\": 0.020087362076702853,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.020087362076702853\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.39591836734693875,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.39591836734693875,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5970149253731343,\n\
\ \"acc_stderr\": 0.034683432951111266,\n \"acc_norm\": 0.5970149253731343,\n\
\ \"acc_norm_stderr\": 0.034683432951111266\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5380116959064327,\n \"acc_stderr\": 0.03823727092882307,\n\
\ \"acc_norm\": 0.5380116959064327,\n \"acc_norm_stderr\": 0.03823727092882307\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21542227662178703,\n\
\ \"mc1_stderr\": 0.01439190265242768,\n \"mc2\": 0.3542683630494196,\n\
\ \"mc2_stderr\": 0.01528351213467969\n }\n}\n```"
repo_url: https://huggingface.co/mosaicml/mpt-7b-8k-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|arc:challenge|25_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|arc:challenge|25_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T05_18_17.182274
path:
- '**/details_harness|drop|3_2023-09-23T05-18-17.182274.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T05-18-17.182274.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T05_18_17.182274
path:
- '**/details_harness|gsm8k|5_2023-09-23T05-18-17.182274.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T05-18-17.182274.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hellaswag|10_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hellaswag|10_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T22:50:02.593202.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-46-19.065505.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T22_50_02.593202
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T22:50:02.593202.parquet'
- split: 2023_10_03T22_46_19.065505
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T22-46-19.065505.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T22-46-19.065505.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T05_18_17.182274
path:
- '**/details_harness|winogrande|5_2023-09-23T05-18-17.182274.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T05-18-17.182274.parquet'
- config_name: results
data_files:
- split: 2023_09_23T05_18_17.182274
path:
- results_2023-09-23T05-18-17.182274.parquet
- split: 2023_10_03T22_46_19.065505
path:
- results_2023-10-03T22-46-19.065505.parquet
- split: latest
path:
- results_2023-10-03T22-46-19.065505.parquet
---
# Dataset Card for Evaluation run of mosaicml/mpt-7b-8k-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mosaicml/mpt-7b-8k-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mosaicml/mpt-7b-8k-instruct](https://huggingface.co/mosaicml/mpt-7b-8k-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mosaicml__mpt-7b-8k-instruct",
"harness_truthfulqa_mc_0",
split="train")
```
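Each task configuration also exposes the run timestamps and a `latest` alias as splits (see the `configs` section above), so a specific task can be pinned to the most recent run. A minimal sketch under that assumption:
```python
from datasets import load_dataset

# Load the most recent evaluation details for one task by using the
# "latest" split alias defined in the configs section of this card.
latest_details = load_dataset(
    "open-llm-leaderboard/details_mosaicml__mpt-7b-8k-instruct",
    "harness_truthfulqa_mc_0",
    split="latest",
)
print(latest_details)
```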
## Latest results
These are the [latest results from run 2023-10-03T22:46:19.065505](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-8k-instruct/blob/main/results_2023-10-03T22-46-19.065505.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4235929462139018,
"acc_stderr": 0.03519064012788231,
"acc_norm": 0.42686334313202534,
"acc_norm_stderr": 0.03518177126224397,
"mc1": 0.21542227662178703,
"mc1_stderr": 0.01439190265242768,
"mc2": 0.3542683630494196,
"mc2_stderr": 0.01528351213467969
},
"harness|arc:challenge|25": {
"acc": 0.4325938566552901,
"acc_stderr": 0.014478005694182528,
"acc_norm": 0.45307167235494883,
"acc_norm_stderr": 0.01454689205200563
},
"harness|hellaswag|10": {
"acc": 0.5736904999004182,
"acc_stderr": 0.004935291975579197,
"acc_norm": 0.7461661023700458,
"acc_norm_stderr": 0.004343142545094253
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4226415094339623,
"acc_stderr": 0.030402331445769537,
"acc_norm": 0.4226415094339623,
"acc_norm_stderr": 0.030402331445769537
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.04140685639111502,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.04140685639111502
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.34104046242774566,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.34104046242774566,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101736,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101736
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.02375292871211214,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.02375292871211214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790605,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.45483870967741935,
"acc_stderr": 0.028327743091561067,
"acc_norm": 0.45483870967741935,
"acc_norm_stderr": 0.028327743091561067
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617732,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617732
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.038154943086889305,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.038154943086889305
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4494949494949495,
"acc_stderr": 0.0354413249194797,
"acc_norm": 0.4494949494949495,
"acc_norm_stderr": 0.0354413249194797
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5906735751295337,
"acc_stderr": 0.03548608168860806,
"acc_norm": 0.5906735751295337,
"acc_norm_stderr": 0.03548608168860806
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.024035489676335065,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.024035489676335065
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230193,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3739495798319328,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.3739495798319328,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5724770642201835,
"acc_stderr": 0.021210910204300434,
"acc_norm": 0.5724770642201835,
"acc_norm_stderr": 0.021210910204300434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.03085199299325701,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.03085199299325701
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.03447891136353382,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.03447891136353382
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6329113924050633,
"acc_stderr": 0.031376240725616185,
"acc_norm": 0.6329113924050633,
"acc_norm_stderr": 0.031376240725616185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4618834080717489,
"acc_stderr": 0.03346015011973228,
"acc_norm": 0.4618834080717489,
"acc_norm_stderr": 0.03346015011973228
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5038167938931297,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.5038167938931297,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.48760330578512395,
"acc_stderr": 0.04562951548180765,
"acc_norm": 0.48760330578512395,
"acc_norm_stderr": 0.04562951548180765
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3803680981595092,
"acc_stderr": 0.03814269893261836,
"acc_norm": 0.3803680981595092,
"acc_norm_stderr": 0.03814269893261836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5726495726495726,
"acc_stderr": 0.032408473935163266,
"acc_norm": 0.5726495726495726,
"acc_norm_stderr": 0.032408473935163266
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5670498084291188,
"acc_stderr": 0.017718469101513982,
"acc_norm": 0.5670498084291188,
"acc_norm_stderr": 0.017718469101513982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.026424816594009852,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.026424816594009852
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2860335195530726,
"acc_stderr": 0.015113972129062127,
"acc_norm": 0.2860335195530726,
"acc_norm_stderr": 0.015113972129062127
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.02849199358617156,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.02849199358617156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4855305466237942,
"acc_stderr": 0.02838619808417768,
"acc_norm": 0.4855305466237942,
"acc_norm_stderr": 0.02838619808417768
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.43209876543209874,
"acc_stderr": 0.027563010971606676,
"acc_norm": 0.43209876543209874,
"acc_norm_stderr": 0.027563010971606676
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.026891709428343957,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.026891709428343957
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3324641460234681,
"acc_stderr": 0.012032022332260518,
"acc_norm": 0.3324641460234681,
"acc_norm_stderr": 0.012032022332260518
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3492647058823529,
"acc_stderr": 0.028959755196824866,
"acc_norm": 0.3492647058823529,
"acc_norm_stderr": 0.028959755196824866
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.020087362076702853,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.020087362076702853
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39591836734693875,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.39591836734693875,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5970149253731343,
"acc_stderr": 0.034683432951111266,
"acc_norm": 0.5970149253731343,
"acc_norm_stderr": 0.034683432951111266
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5380116959064327,
"acc_stderr": 0.03823727092882307,
"acc_norm": 0.5380116959064327,
"acc_norm_stderr": 0.03823727092882307
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21542227662178703,
"mc1_stderr": 0.01439190265242768,
"mc2": 0.3542683630494196,
"mc2_stderr": 0.01528351213467969
}
}
```
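The same aggregated numbers are stored in the `results` configuration, which can be loaded like any other configuration; a hedged sketch (the column layout of the results parquet files is not documented here, so inspect the loaded dataset before relying on specific fields):
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics for each run;
# its "latest" split points at the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_mosaicml__mpt-7b-8k-instruct",
    "results",
    split="latest",
)
print(results)  # inspect available columns before using specific fields
```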
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_mosaicml__mpt-7b-8k-chat | 2023-10-03T22:40:47.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of mosaicml/mpt-7b-8k-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mosaicml/mpt-7b-8k-chat](https://huggingface.co/mosaicml/mpt-7b-8k-chat) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mosaicml__mpt-7b-8k-chat\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T22:39:26.235100](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-8k-chat/blob/main/results_2023-10-03T22-39-26.235100.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4176139970228506,\n\
\ \"acc_stderr\": 0.03519122289888466,\n \"acc_norm\": 0.4218812138341268,\n\
\ \"acc_norm_stderr\": 0.03518089162251733,\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.01584631510139481,\n \"mc2\": 0.43651464403110796,\n\
\ \"mc2_stderr\": 0.014875310662595358\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.42406143344709896,\n \"acc_stderr\": 0.014441889627464396,\n\
\ \"acc_norm\": 0.47696245733788395,\n \"acc_norm_stderr\": 0.014595873205358269\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5759808803027285,\n\
\ \"acc_stderr\": 0.004931831953800041,\n \"acc_norm\": 0.7748456482772356,\n\
\ \"acc_norm_stderr\": 0.004168303070233535\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.040089737857792046,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.040089737857792046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4641509433962264,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.4641509433962264,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3958333333333333,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.3958333333333333,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.43548387096774194,\n \"acc_stderr\": 0.02820622559150274,\n \"\
acc_norm\": 0.43548387096774194,\n \"acc_norm_stderr\": 0.02820622559150274\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233485,\n \"\
acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233485\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.038881769216741004,\n\
\ \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.038881769216741004\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.51010101010101,\n \"acc_stderr\": 0.035616254886737454,\n \"acc_norm\"\
: 0.51010101010101,\n \"acc_norm_stderr\": 0.035616254886737454\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6113989637305699,\n \"acc_stderr\": 0.035177397963731316,\n\
\ \"acc_norm\": 0.6113989637305699,\n \"acc_norm_stderr\": 0.035177397963731316\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.37435897435897436,\n \"acc_stderr\": 0.02453759157283052,\n\
\ \"acc_norm\": 0.37435897435897436,\n \"acc_norm_stderr\": 0.02453759157283052\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945287,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945287\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3697478991596639,\n \"acc_stderr\": 0.031357095996135904,\n\
\ \"acc_norm\": 0.3697478991596639,\n \"acc_norm_stderr\": 0.031357095996135904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5064220183486239,\n \"acc_stderr\": 0.021435554820013077,\n \"\
acc_norm\": 0.5064220183486239,\n \"acc_norm_stderr\": 0.021435554820013077\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.030388051301678116,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.030388051301678116\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5147058823529411,\n \"acc_stderr\": 0.03507793834791324,\n \"\
acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03507793834791324\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5569620253164557,\n \"acc_stderr\": 0.032335327775334835,\n \
\ \"acc_norm\": 0.5569620253164557,\n \"acc_norm_stderr\": 0.032335327775334835\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5022421524663677,\n\
\ \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.5022421524663677,\n\
\ \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5038167938931297,\n \"acc_stderr\": 0.043851623256015534,\n\
\ \"acc_norm\": 0.5038167938931297,\n \"acc_norm_stderr\": 0.043851623256015534\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5206611570247934,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536824,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536824\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4110429447852761,\n \"acc_stderr\": 0.038656978537853624,\n\
\ \"acc_norm\": 0.4110429447852761,\n \"acc_norm_stderr\": 0.038656978537853624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4854368932038835,\n \"acc_stderr\": 0.04948637324026637,\n\
\ \"acc_norm\": 0.4854368932038835,\n \"acc_norm_stderr\": 0.04948637324026637\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5982905982905983,\n\
\ \"acc_stderr\": 0.03211693751051621,\n \"acc_norm\": 0.5982905982905983,\n\
\ \"acc_norm_stderr\": 0.03211693751051621\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.558109833971903,\n\
\ \"acc_stderr\": 0.017758800534214417,\n \"acc_norm\": 0.558109833971903,\n\
\ \"acc_norm_stderr\": 0.017758800534214417\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4161849710982659,\n \"acc_stderr\": 0.02653818910470547,\n\
\ \"acc_norm\": 0.4161849710982659,\n \"acc_norm_stderr\": 0.02653818910470547\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.41830065359477125,\n \"acc_stderr\": 0.02824513402438729,\n\
\ \"acc_norm\": 0.41830065359477125,\n \"acc_norm_stderr\": 0.02824513402438729\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4630225080385852,\n\
\ \"acc_stderr\": 0.02832032583010592,\n \"acc_norm\": 0.4630225080385852,\n\
\ \"acc_norm_stderr\": 0.02832032583010592\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.027513747284379424,\n\
\ \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.027513747284379424\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.30141843971631205,\n \"acc_stderr\": 0.027374128882631153,\n \
\ \"acc_norm\": 0.30141843971631205,\n \"acc_norm_stderr\": 0.027374128882631153\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3239895697522816,\n\
\ \"acc_stderr\": 0.011952840809646575,\n \"acc_norm\": 0.3239895697522816,\n\
\ \"acc_norm_stderr\": 0.011952840809646575\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3161764705882353,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.3161764705882353,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4166666666666667,\n \"acc_stderr\": 0.019944914136873586,\n \
\ \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.019944914136873586\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.35918367346938773,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.35918367346938773,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5074626865671642,\n\
\ \"acc_stderr\": 0.03535140084276719,\n \"acc_norm\": 0.5074626865671642,\n\
\ \"acc_norm_stderr\": 0.03535140084276719\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5672514619883041,\n \"acc_stderr\": 0.03799978644370607,\n\
\ \"acc_norm\": 0.5672514619883041,\n \"acc_norm_stderr\": 0.03799978644370607\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.01584631510139481,\n \"mc2\": 0.43651464403110796,\n\
\ \"mc2_stderr\": 0.014875310662595358\n }\n}\n```"
repo_url: https://huggingface.co/mosaicml/mpt-7b-8k-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|arc:challenge|25_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|arc:challenge|25_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hellaswag|10_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hellaswag|10_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T22:52:00.675121.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-39-26.235100.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-39-26.235100.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T22_52_00.675121
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T22:52:00.675121.parquet'
- split: 2023_10_03T22_39_26.235100
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T22-39-26.235100.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T22-39-26.235100.parquet'
- config_name: results
data_files:
- split: 2023_10_03T22_39_26.235100
path:
- results_2023-10-03T22-39-26.235100.parquet
- split: latest
path:
- results_2023-10-03T22-39-26.235100.parquet
---
# Dataset Card for Evaluation run of mosaicml/mpt-7b-8k-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mosaicml/mpt-7b-8k-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mosaicml/mpt-7b-8k-chat](https://huggingface.co/mosaicml/mpt-7b-8k-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mosaicml__mpt-7b-8k-chat",
"harness_truthfulqa_mc_0",
split="train")
```
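The same pattern works for any of the per-task configurations listed above. If you want the aggregated scores rather than per-example details, you can load the `results` configuration instead; a minimal sketch, using only the configuration and split names declared in this card:
```python
from datasets import load_dataset

# Aggregated metrics of the most recent evaluation run.
# The "results" configuration exposes timestamped splits plus a "latest" alias
# (see the config list above).
results = load_dataset(
    "open-llm-leaderboard/details_mosaicml__mpt-7b-8k-chat",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics
```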
## Latest results
These are the [latest results from run 2023-10-03T22:39:26.235100](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-8k-chat/blob/main/results_2023-10-03T22-39-26.235100.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4176139970228506,
"acc_stderr": 0.03519122289888466,
"acc_norm": 0.4218812138341268,
"acc_norm_stderr": 0.03518089162251733,
"mc1": 0.2876376988984088,
"mc1_stderr": 0.01584631510139481,
"mc2": 0.43651464403110796,
"mc2_stderr": 0.014875310662595358
},
"harness|arc:challenge|25": {
"acc": 0.42406143344709896,
"acc_stderr": 0.014441889627464396,
"acc_norm": 0.47696245733788395,
"acc_norm_stderr": 0.014595873205358269
},
"harness|hellaswag|10": {
"acc": 0.5759808803027285,
"acc_stderr": 0.004931831953800041,
"acc_norm": 0.7748456482772356,
"acc_norm_stderr": 0.004168303070233535
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.040089737857792046,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.040089737857792046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4641509433962264,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.4641509433962264,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3958333333333333,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.3958333333333333,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.023517294335963286,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.023517294335963286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.43548387096774194,
"acc_stderr": 0.02820622559150274,
"acc_norm": 0.43548387096774194,
"acc_norm_stderr": 0.02820622559150274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233485,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.038881769216741004,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.038881769216741004
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.51010101010101,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.51010101010101,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6113989637305699,
"acc_stderr": 0.035177397963731316,
"acc_norm": 0.6113989637305699,
"acc_norm_stderr": 0.035177397963731316
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.37435897435897436,
"acc_stderr": 0.02453759157283052,
"acc_norm": 0.37435897435897436,
"acc_norm_stderr": 0.02453759157283052
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945287,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945287
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3697478991596639,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.3697478991596639,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5064220183486239,
"acc_stderr": 0.021435554820013077,
"acc_norm": 0.5064220183486239,
"acc_norm_stderr": 0.021435554820013077
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.030388051301678116,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.030388051301678116
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03507793834791324,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03507793834791324
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5569620253164557,
"acc_stderr": 0.032335327775334835,
"acc_norm": 0.5569620253164557,
"acc_norm_stderr": 0.032335327775334835
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5022421524663677,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.5022421524663677,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5038167938931297,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.5038167938931297,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5206611570247934,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.5206611570247934,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536824,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536824
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4110429447852761,
"acc_stderr": 0.038656978537853624,
"acc_norm": 0.4110429447852761,
"acc_norm_stderr": 0.038656978537853624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.4854368932038835,
"acc_stderr": 0.04948637324026637,
"acc_norm": 0.4854368932038835,
"acc_norm_stderr": 0.04948637324026637
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5982905982905983,
"acc_stderr": 0.03211693751051621,
"acc_norm": 0.5982905982905983,
"acc_norm_stderr": 0.03211693751051621
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.558109833971903,
"acc_stderr": 0.017758800534214417,
"acc_norm": 0.558109833971903,
"acc_norm_stderr": 0.017758800534214417
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.02653818910470547,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.02653818910470547
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.41830065359477125,
"acc_stderr": 0.02824513402438729,
"acc_norm": 0.41830065359477125,
"acc_norm_stderr": 0.02824513402438729
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4630225080385852,
"acc_stderr": 0.02832032583010592,
"acc_norm": 0.4630225080385852,
"acc_norm_stderr": 0.02832032583010592
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.027513747284379424,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.027513747284379424
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30141843971631205,
"acc_stderr": 0.027374128882631153,
"acc_norm": 0.30141843971631205,
"acc_norm_stderr": 0.027374128882631153
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3239895697522816,
"acc_stderr": 0.011952840809646575,
"acc_norm": 0.3239895697522816,
"acc_norm_stderr": 0.011952840809646575
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3161764705882353,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.3161764705882353,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.019944914136873586,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.019944914136873586
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.35918367346938773,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.35918367346938773,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5074626865671642,
"acc_stderr": 0.03535140084276719,
"acc_norm": 0.5074626865671642,
"acc_norm_stderr": 0.03535140084276719
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5672514619883041,
"acc_stderr": 0.03799978644370607,
"acc_norm": 0.5672514619883041,
"acc_norm_stderr": 0.03799978644370607
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2876376988984088,
"mc1_stderr": 0.01584631510139481,
"mc2": 0.43651464403110796,
"mc2_stderr": 0.014875310662595358
}
}
```
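If you prefer working with the raw results file rather than the `datasets` API, it can be downloaded directly from the repository; a minimal sketch using `huggingface_hub` (the filename is the one linked above, and the exact JSON layout may differ slightly from the excerpt shown):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON referenced in the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_mosaicml__mpt-7b-8k-chat",
    filename="results_2023-10-03T22-39-26.235100.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Inspect the top-level keys before drilling into specific metrics.
print(list(results))
```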
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_mosaicml__mpt-7b | 2023-10-03T22:11:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of mosaicml/mpt-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mosaicml/mpt-7b](https://huggingface.co/mosaicml/mpt-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 122 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split is always pointing to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mosaicml__mpt-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T22:10:31.153532](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b/blob/main/results_2023-10-03T22-10-31.153532.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.28815728428182913,\n\
\ \"acc_stderr\": 0.032729017222815425,\n \"acc_norm\": 0.2923951167846347,\n\
\ \"acc_norm_stderr\": 0.032718180607395383,\n \"mc1\": 0.20563035495716034,\n\
\ \"mc1_stderr\": 0.014148482219460974,\n \"mc2\": 0.3354506043570123,\n\
\ \"mc2_stderr\": 0.013110323313593984\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.42918088737201365,\n \"acc_stderr\": 0.014464085894870653,\n\
\ \"acc_norm\": 0.47696245733788395,\n \"acc_norm_stderr\": 0.014595873205358269\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5730930093606851,\n\
\ \"acc_stderr\": 0.004936176784631949,\n \"acc_norm\": 0.7753435570603465,\n\
\ \"acc_norm_stderr\": 0.0041650291643616005\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03583496176361062,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03583496176361062\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.28679245283018867,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.28679245283018867,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2847222222222222,\n\
\ \"acc_stderr\": 0.03773809990686935,\n \"acc_norm\": 0.2847222222222222,\n\
\ \"acc_norm_stderr\": 0.03773809990686935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.033687629322594295,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.033687629322594295\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.03097669299853442,\n\
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.03097669299853442\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184763,\n \"\
acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184763\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790605,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n\
\ \"acc_stderr\": 0.024685979286239952,\n \"acc_norm\": 0.25161290322580643,\n\
\ \"acc_norm_stderr\": 0.024685979286239952\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.02850137816789395,\n\
\ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.02850137816789395\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.033464098810559534,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.033464098810559534\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479047,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479047\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35751295336787564,\n \"acc_stderr\": 0.03458816042181006,\n\
\ \"acc_norm\": 0.35751295336787564,\n \"acc_norm_stderr\": 0.03458816042181006\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.32051282051282054,\n \"acc_stderr\": 0.02366129639396427,\n\
\ \"acc_norm\": 0.32051282051282054,\n \"acc_norm_stderr\": 0.02366129639396427\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.28991596638655465,\n \"acc_stderr\": 0.029472485833136098,\n\
\ \"acc_norm\": 0.28991596638655465,\n \"acc_norm_stderr\": 0.029472485833136098\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26055045871559634,\n \"acc_stderr\": 0.018819182034850068,\n \"\
acc_norm\": 0.26055045871559634,\n \"acc_norm_stderr\": 0.018819182034850068\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3055555555555556,\n \"acc_stderr\": 0.03141554629402544,\n \"\
acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.03141554629402544\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604257,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604257\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3094170403587444,\n\
\ \"acc_stderr\": 0.031024411740572203,\n \"acc_norm\": 0.3094170403587444,\n\
\ \"acc_norm_stderr\": 0.031024411740572203\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2975206611570248,\n \"acc_stderr\": 0.04173349148083498,\n \"\
acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.04173349148083498\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.041858325989283136,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.041858325989283136\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.32051282051282054,\n\
\ \"acc_stderr\": 0.03057281131029961,\n \"acc_norm\": 0.32051282051282054,\n\
\ \"acc_norm_stderr\": 0.03057281131029961\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3001277139208174,\n\
\ \"acc_stderr\": 0.016389249691317425,\n \"acc_norm\": 0.3001277139208174,\n\
\ \"acc_norm_stderr\": 0.016389249691317425\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2630057803468208,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.2630057803468208,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\
\ \"acc_stderr\": 0.014378169884098423,\n \"acc_norm\": 0.2446927374301676,\n\
\ \"acc_norm_stderr\": 0.014378169884098423\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28104575163398693,\n \"acc_stderr\": 0.025738854797818726,\n\
\ \"acc_norm\": 0.28104575163398693,\n \"acc_norm_stderr\": 0.025738854797818726\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.02600330111788513,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.02600330111788513\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.32098765432098764,\n \"acc_stderr\": 0.025976566010862737,\n\
\ \"acc_norm\": 0.32098765432098764,\n \"acc_norm_stderr\": 0.025976566010862737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290392,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290392\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2607561929595828,\n\
\ \"acc_stderr\": 0.011213471559602325,\n \"acc_norm\": 0.2607561929595828,\n\
\ \"acc_norm_stderr\": 0.011213471559602325\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1948529411764706,\n \"acc_stderr\": 0.024060599423487414,\n\
\ \"acc_norm\": 0.1948529411764706,\n \"acc_norm_stderr\": 0.024060599423487414\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177788,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177788\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3020408163265306,\n \"acc_stderr\": 0.029393609319879818,\n\
\ \"acc_norm\": 0.3020408163265306,\n \"acc_norm_stderr\": 0.029393609319879818\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n\
\ \"acc_stderr\": 0.03711725190740749,\n \"acc_norm\": 0.3493975903614458,\n\
\ \"acc_norm_stderr\": 0.03711725190740749\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20563035495716034,\n\
\ \"mc1_stderr\": 0.014148482219460974,\n \"mc2\": 0.3354506043570123,\n\
\ \"mc2_stderr\": 0.013110323313593984\n }\n}\n```"
repo_url: https://huggingface.co/mosaicml/mpt-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|arc:challenge|25_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|arc:challenge|25_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T17_09_43.658606
path:
- '**/details_harness|drop|3_2023-09-23T17-09-43.658606.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T17-09-43.658606.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T17_09_43.658606
path:
- '**/details_harness|gsm8k|5_2023-09-23T17-09-43.658606.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T17-09-43.658606.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hellaswag|10_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hellaswag|10_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T15:05:51.358534.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-10-31.153532.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T15_05_51.358534
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T15:05:51.358534.parquet'
- split: 2023_10_03T22_10_31.153532
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T22-10-31.153532.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T22-10-31.153532.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T17_09_43.658606
path:
- '**/details_harness|winogrande|5_2023-09-23T17-09-43.658606.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T17-09-43.658606.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:09:40.976892.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:09:40.976892.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:09:40.976892.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:09:40.976892.parquet'
- config_name: results
data_files:
- split: 2023_08_28T20_09_40.976892
path:
- results_2023-08-28T20:09:40.976892.parquet
- split: 2023_09_23T17_09_43.658606
path:
- results_2023-09-23T17-09-43.658606.parquet
- split: 2023_10_03T22_10_31.153532
path:
- results_2023-10-03T22-10-31.153532.parquet
- split: latest
path:
- results_2023-10-03T22-10-31.153532.parquet
---
# Dataset Card for Evaluation run of mosaicml/mpt-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mosaicml/mpt-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mosaicml/mpt-7b](https://huggingface.co/mosaicml/mpt-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mosaicml__mpt-7b",
"harness_truthfulqa_mc_0",
split="train")
```
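The aggregated metrics can be loaded in the same way; here is a minimal sketch that uses the "results" configuration and the "latest" split defined above:
```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_mosaicml__mpt-7b",
	"results",
	split="latest")
```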
## Latest results
These are the [latest results from run 2023-10-03T22:10:31.153532](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b/blob/main/results_2023-10-03T22-10-31.153532.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.28815728428182913,
"acc_stderr": 0.032729017222815425,
"acc_norm": 0.2923951167846347,
"acc_norm_stderr": 0.032718180607395383,
"mc1": 0.20563035495716034,
"mc1_stderr": 0.014148482219460974,
"mc2": 0.3354506043570123,
"mc2_stderr": 0.013110323313593984
},
"harness|arc:challenge|25": {
"acc": 0.42918088737201365,
"acc_stderr": 0.014464085894870653,
"acc_norm": 0.47696245733788395,
"acc_norm_stderr": 0.014595873205358269
},
"harness|hellaswag|10": {
"acc": 0.5730930093606851,
"acc_stderr": 0.004936176784631949,
"acc_norm": 0.7753435570603465,
"acc_norm_stderr": 0.0041650291643616005
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.035914440841969694
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03583496176361062,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03583496176361062
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.28679245283018867,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.28679245283018867,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2847222222222222,
"acc_stderr": 0.03773809990686935,
"acc_norm": 0.2847222222222222,
"acc_norm_stderr": 0.03773809990686935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.033687629322594295,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.033687629322594295
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.03097669299853442,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.03097669299853442
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184763,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184763
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790605,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239952,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239952
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.02850137816789395,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.02850137816789395
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.033464098810559534,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.033464098810559534
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02962022787479047,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02962022787479047
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35751295336787564,
"acc_stderr": 0.03458816042181006,
"acc_norm": 0.35751295336787564,
"acc_norm_stderr": 0.03458816042181006
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32051282051282054,
"acc_stderr": 0.02366129639396427,
"acc_norm": 0.32051282051282054,
"acc_norm_stderr": 0.02366129639396427
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.28991596638655465,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.28991596638655465,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26055045871559634,
"acc_stderr": 0.018819182034850068,
"acc_norm": 0.26055045871559634,
"acc_norm_stderr": 0.018819182034850068
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03141554629402544,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03141554629402544
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604257,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3094170403587444,
"acc_stderr": 0.031024411740572203,
"acc_norm": 0.3094170403587444,
"acc_norm_stderr": 0.031024411740572203
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2975206611570248,
"acc_stderr": 0.04173349148083498,
"acc_norm": 0.2975206611570248,
"acc_norm_stderr": 0.04173349148083498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.32051282051282054,
"acc_stderr": 0.03057281131029961,
"acc_norm": 0.32051282051282054,
"acc_norm_stderr": 0.03057281131029961
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3001277139208174,
"acc_stderr": 0.016389249691317425,
"acc_norm": 0.3001277139208174,
"acc_norm_stderr": 0.016389249691317425
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2630057803468208,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.2630057803468208,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098423,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098423
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28104575163398693,
"acc_stderr": 0.025738854797818726,
"acc_norm": 0.28104575163398693,
"acc_norm_stderr": 0.025738854797818726
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.02600330111788513,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.02600330111788513
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.32098765432098764,
"acc_stderr": 0.025976566010862737,
"acc_norm": 0.32098765432098764,
"acc_norm_stderr": 0.025976566010862737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290392,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290392
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2607561929595828,
"acc_stderr": 0.011213471559602325,
"acc_norm": 0.2607561929595828,
"acc_norm_stderr": 0.011213471559602325
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1948529411764706,
"acc_stderr": 0.024060599423487414,
"acc_norm": 0.1948529411764706,
"acc_norm_stderr": 0.024060599423487414
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177788,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177788
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3020408163265306,
"acc_stderr": 0.029393609319879818,
"acc_norm": 0.3020408163265306,
"acc_norm_stderr": 0.029393609319879818
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3493975903614458,
"acc_stderr": 0.03711725190740749,
"acc_norm": 0.3493975903614458,
"acc_norm_stderr": 0.03711725190740749
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.20563035495716034,
"mc1_stderr": 0.014148482219460974,
"mc2": 0.3354506043570123,
"mc2_stderr": 0.013110323313593984
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_quantumaikr__QuantumLM | 2023-08-27T12:44:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of quantumaikr/QuantumLM
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [quantumaikr/QuantumLM](https://huggingface.co/quantumaikr/QuantumLM) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__QuantumLM\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T20:06:17.327995](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__QuantumLM/blob/main/results_2023-08-22T20%3A06%3A17.327995.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5420843520573295,\n\
\ \"acc_stderr\": 0.03473892519148735,\n \"acc_norm\": 0.5462691332390492,\n\
\ \"acc_norm_stderr\": 0.03472239225051118,\n \"mc1\": 0.31211750305997554,\n\
\ \"mc1_stderr\": 0.016220756769520932,\n \"mc2\": 0.4671087939545944,\n\
\ \"mc2_stderr\": 0.014916309036247634\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5093856655290102,\n \"acc_stderr\": 0.014608816322065,\n\
\ \"acc_norm\": 0.5580204778156996,\n \"acc_norm_stderr\": 0.014512682523128343\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5990838478390759,\n\
\ \"acc_stderr\": 0.004890824718530301,\n \"acc_norm\": 0.7973511252738499,\n\
\ \"acc_norm_stderr\": 0.004011514999872581\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5622641509433962,\n \"acc_stderr\": 0.030533338430467523,\n\
\ \"acc_norm\": 0.5622641509433962,\n \"acc_norm_stderr\": 0.030533338430467523\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425075,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425075\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n\
\ \"acc_stderr\": 0.027430866579973467,\n \"acc_norm\": 0.632258064516129,\n\
\ \"acc_norm_stderr\": 0.027430866579973467\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806587,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806587\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.030748905363909895,\n\
\ \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.030748905363909895\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4948717948717949,\n \"acc_stderr\": 0.02534967290683865,\n \
\ \"acc_norm\": 0.4948717948717949,\n \"acc_norm_stderr\": 0.02534967290683865\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028604,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028604\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03242225027115007,\n \
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03242225027115007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.726605504587156,\n \"acc_stderr\": 0.01910929984609828,\n \"acc_norm\"\
: 0.726605504587156,\n \"acc_norm_stderr\": 0.01910929984609828\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39351851851851855,\n\
\ \"acc_stderr\": 0.03331747876370312,\n \"acc_norm\": 0.39351851851851855,\n\
\ \"acc_norm_stderr\": 0.03331747876370312\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n\
\ \"acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7046413502109705,\n \"acc_stderr\": 0.029696338713422886,\n \
\ \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.029696338713422886\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.02685345037700917,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.02685345037700917\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7420178799489144,\n\
\ \"acc_stderr\": 0.01564583018834895,\n \"acc_norm\": 0.7420178799489144,\n\
\ \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124658,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124658\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35307262569832404,\n\
\ \"acc_stderr\": 0.015984204545268565,\n \"acc_norm\": 0.35307262569832404,\n\
\ \"acc_norm_stderr\": 0.015984204545268565\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.028074158947600653,\n\
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.028074158947600653\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751468,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751468\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.027237415094592477,\n\
\ \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.027237415094592477\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.028602085862759415,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.028602085862759415\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37222946544980445,\n\
\ \"acc_stderr\": 0.01234624129720437,\n \"acc_norm\": 0.37222946544980445,\n\
\ \"acc_norm_stderr\": 0.01234624129720437\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5130718954248366,\n \"acc_stderr\": 0.020220920829626916,\n \
\ \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.020220920829626916\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.03125127591089166,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.03125127591089166\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n\
\ \"acc_stderr\": 0.03203841040213322,\n \"acc_norm\": 0.7114427860696517,\n\
\ \"acc_norm_stderr\": 0.03203841040213322\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.033773102522092056,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.033773102522092056\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31211750305997554,\n\
\ \"mc1_stderr\": 0.016220756769520932,\n \"mc2\": 0.4671087939545944,\n\
\ \"mc2_stderr\": 0.014916309036247634\n }\n}\n```"
repo_url: https://huggingface.co/quantumaikr/QuantumLM
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|arc:challenge|25_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|arc:challenge|25_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hellaswag|10_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hellaswag|10_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T12:43:24.978331.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T20:06:17.327995.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T20:06:17.327995.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T12_43_24.978331
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T12:43:24.978331.parquet'
- split: 2023_08_22T20_06_17.327995
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T20:06:17.327995.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T20:06:17.327995.parquet'
---
# Dataset Card for Evaluation run of quantumaikr/QuantumLM
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/quantumaikr/QuantumLM
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [quantumaikr/QuantumLM](https://huggingface.co/quantumaikr/QuantumLM) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_quantumaikr__QuantumLM",
"harness_truthfulqa_mc_0",
split="train")
```
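The same pattern works for any of the configurations listed in this card's YAML header. As a minimal sketch (assuming network access to the Hub), the snippet below enumerates the available configurations with `get_dataset_config_names` and loads the most recent run of one MMLU subtask through its "latest" split:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_quantumaikr__QuantumLM"

# One configuration per evaluated task (see the YAML header of this card).
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Load the most recent run of a single MMLU subtask via its "latest" split.
abstract_algebra = load_dataset(
    repo,
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(abstract_algebra)
```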
## Latest results
These are the [latest results from run 2023-08-22T20:06:17.327995](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__QuantumLM/blob/main/results_2023-08-22T20%3A06%3A17.327995.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5420843520573295,
"acc_stderr": 0.03473892519148735,
"acc_norm": 0.5462691332390492,
"acc_norm_stderr": 0.03472239225051118,
"mc1": 0.31211750305997554,
"mc1_stderr": 0.016220756769520932,
"mc2": 0.4671087939545944,
"mc2_stderr": 0.014916309036247634
},
"harness|arc:challenge|25": {
"acc": 0.5093856655290102,
"acc_stderr": 0.014608816322065,
"acc_norm": 0.5580204778156996,
"acc_norm_stderr": 0.014512682523128343
},
"harness|hellaswag|10": {
"acc": 0.5990838478390759,
"acc_stderr": 0.004890824718530301,
"acc_norm": 0.7973511252738499,
"acc_norm_stderr": 0.004011514999872581
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5622641509433962,
"acc_stderr": 0.030533338430467523,
"acc_norm": 0.5622641509433962,
"acc_norm_stderr": 0.030533338430467523
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425075,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425075
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.632258064516129,
"acc_stderr": 0.027430866579973467,
"acc_norm": 0.632258064516129,
"acc_norm_stderr": 0.027430866579973467
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806587,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806587
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.030748905363909895,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.030748905363909895
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4948717948717949,
"acc_stderr": 0.02534967290683865,
"acc_norm": 0.4948717948717949,
"acc_norm_stderr": 0.02534967290683865
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028604,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028604
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03242225027115007,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03242225027115007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.726605504587156,
"acc_stderr": 0.01910929984609828,
"acc_norm": 0.726605504587156,
"acc_norm_stderr": 0.01910929984609828
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.029696338713422886,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.029696338713422886
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.02685345037700917,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.02685345037700917
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7420178799489144,
"acc_stderr": 0.01564583018834895,
"acc_norm": 0.7420178799489144,
"acc_norm_stderr": 0.01564583018834895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124658,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124658
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35307262569832404,
"acc_stderr": 0.015984204545268565,
"acc_norm": 0.35307262569832404,
"acc_norm_stderr": 0.015984204545268565
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.028074158947600653,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.028074158947600653
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751468,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751468
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.027237415094592477,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.027237415094592477
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.028602085862759415,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.028602085862759415
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37222946544980445,
"acc_stderr": 0.01234624129720437,
"acc_norm": 0.37222946544980445,
"acc_norm_stderr": 0.01234624129720437
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5130718954248366,
"acc_stderr": 0.020220920829626916,
"acc_norm": 0.5130718954248366,
"acc_norm_stderr": 0.020220920829626916
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.03125127591089166,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.03125127591089166
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213322,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213322
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.033773102522092056,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.033773102522092056
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31211750305997554,
"mc1_stderr": 0.016220756769520932,
"mc2": 0.4671087939545944,
"mc2_stderr": 0.014916309036247634
}
}
```
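The per-task scores above can also be aggregated client-side. The following is a minimal sketch (not part of the evaluation harness) that computes a macro-average accuracy over the `harness|hendrycksTest-*` entries of a results dictionary shaped like the JSON above; only a few values are copied inline for illustration:
```python
# Minimal sketch: macro-average the MMLU (hendrycksTest) accuracies from a
# results dict shaped like the JSON above. Values are copied from this card
# for illustration only.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.37},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.48148148148148145},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5394736842105263},
    "harness|truthfulqa:mc|0": {"mc1": 0.31211750305997554},  # not an MMLU task
}

mmlu_scores = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]

print(f"Macro-average MMLU accuracy over {len(mmlu_scores)} tasks: "
      f"{sum(mmlu_scores) / len(mmlu_scores):.4f}")
```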
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-13b-mini | 2023-09-17T11:42:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of rombodawg/LosslessMegaCoder-llama2-13b-mini
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rombodawg/LosslessMegaCoder-llama2-13b-mini](https://huggingface.co/rombodawg/LosslessMegaCoder-llama2-13b-mini)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-13b-mini\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T11:42:02.372099](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-13b-mini/blob/main/results_2023-09-17T11-42-02.372099.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0030411073825503355,\n\
\ \"em_stderr\": 0.0005638896908753115,\n \"f1\": 0.07890205536912773,\n\
\ \"f1_stderr\": 0.0016368809848969982,\n \"acc\": 0.4643729284759866,\n\
\ \"acc_stderr\": 0.010956919441194278\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0030411073825503355,\n \"em_stderr\": 0.0005638896908753115,\n\
\ \"f1\": 0.07890205536912773,\n \"f1_stderr\": 0.0016368809848969982\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15921152388172857,\n \
\ \"acc_stderr\": 0.010077966717551878\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.01183587216483668\n\
\ }\n}\n```"
repo_url: https://huggingface.co/rombodawg/LosslessMegaCoder-llama2-13b-mini
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|arc:challenge|25_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T11_42_02.372099
path:
- '**/details_harness|drop|3_2023-09-17T11-42-02.372099.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T11-42-02.372099.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T11_42_02.372099
path:
- '**/details_harness|gsm8k|5_2023-09-17T11-42-02.372099.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T11-42-02.372099.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hellaswag|10_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T11_42_02.372099
path:
- '**/details_harness|winogrande|5_2023-09-17T11-42-02.372099.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T11-42-02.372099.parquet'
- config_name: results
data_files:
- split: 2023_09_17T11_42_02.372099
path:
- results_2023-09-17T11-42-02.372099.parquet
- split: latest
path:
- results_2023-09-17T11-42-02.372099.parquet
---
# Dataset Card for Evaluation run of rombodawg/LosslessMegaCoder-llama2-13b-mini
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/rombodawg/LosslessMegaCoder-llama2-13b-mini
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [rombodawg/LosslessMegaCoder-llama2-13b-mini](https://huggingface.co/rombodawg/LosslessMegaCoder-llama2-13b-mini) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-13b-mini",
"harness_winogrande_5",
split="latest")
```
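
The same call works for any of the configurations listed above. As a minimal sketch (configuration and split names are taken from the config listing above and are specific to this run), you could load the aggregated "results" configuration, or pin a task to a specific timestamped run:

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration at its latest run.
results = load_dataset(
    "open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-13b-mini",
    "results",
    split="latest",
)

# Or pin a specific run by using its timestamped split name from the configs above.
winogrande_run = load_dataset(
    "open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-13b-mini",
    "harness_winogrande_5",
    split="2023_09_17T11_42_02.372099",
)
```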
## Latest results
These are the [latest results from run 2023-09-17T11:42:02.372099](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-13b-mini/blob/main/results_2023-09-17T11-42-02.372099.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0030411073825503355,
"em_stderr": 0.0005638896908753115,
"f1": 0.07890205536912773,
"f1_stderr": 0.0016368809848969982,
"acc": 0.4643729284759866,
"acc_stderr": 0.010956919441194278
},
"harness|drop|3": {
"em": 0.0030411073825503355,
"em_stderr": 0.0005638896908753115,
"f1": 0.07890205536912773,
"f1_stderr": 0.0016368809848969982
},
"harness|gsm8k|5": {
"acc": 0.15921152388172857,
"acc_stderr": 0.010077966717551878
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.01183587216483668
}
}
```
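
If you prefer to work with the raw results JSON linked above rather than going through `datasets`, a minimal sketch (assuming the `huggingface_hub` client is installed) is:

```python
import json

from huggingface_hub import hf_hub_download

# Minimal sketch: download the results JSON referenced above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-13b-mini",
    filename="results_2023-09-17T11-42-02.372099.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# The aggregated metrics shown above ("all" and the per-task entries) live in this file;
# inspect the keys to locate them, since the exact top-level layout may vary.
print(list(results.keys()))
```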
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_LLMs__WizardLM-30B-V1.0 | 2023-08-27T12:44:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of LLMs/WizardLM-30B-V1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LLMs/WizardLM-30B-V1.0](https://huggingface.co/LLMs/WizardLM-30B-V1.0) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LLMs__WizardLM-30B-V1.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T13:36:33.189763](https://huggingface.co/datasets/open-llm-leaderboard/details_LLMs__WizardLM-30B-V1.0/blob/main/results_2023-08-22T13%3A36%3A33.189763.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.591455168527118,\n\
\ \"acc_stderr\": 0.033836006506247,\n \"acc_norm\": 0.5952085142108857,\n\
\ \"acc_norm_stderr\": 0.03381505162025332,\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5248739406679576,\n\
\ \"mc2_stderr\": 0.01580828169375083\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6032423208191127,\n \"acc_stderr\": 0.014296513020180647,\n\
\ \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893456\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6334395538737303,\n\
\ \"acc_stderr\": 0.004808802114592843,\n \"acc_norm\": 0.8327026488747261,\n\
\ \"acc_norm_stderr\": 0.003724783389253327\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776285,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776285\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.04598188057816541,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.04598188057816541\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159788,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6935483870967742,\n \"acc_stderr\": 0.026226485652553883,\n \"\
acc_norm\": 0.6935483870967742,\n \"acc_norm_stderr\": 0.026226485652553883\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.034819048444388045,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.034819048444388045\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915332,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915332\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.02493931390694079,\n \
\ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.02493931390694079\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815642,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815642\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7908256880733945,\n \"acc_stderr\": 0.01743793717334323,\n \"\
acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.01743793717334323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n\
\ \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n\
\ \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229093,\n\
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229093\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.031811497470553604,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.031811497470553604\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070415,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070415\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854934,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077785,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077785\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7701149425287356,\n\
\ \"acc_stderr\": 0.01504630184669182,\n \"acc_norm\": 0.7701149425287356,\n\
\ \"acc_norm_stderr\": 0.01504630184669182\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.025522474632121612,\n\
\ \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.025522474632121612\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3776536312849162,\n\
\ \"acc_stderr\": 0.01621414875213663,\n \"acc_norm\": 0.3776536312849162,\n\
\ \"acc_norm_stderr\": 0.01621414875213663\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388863,\n\
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388863\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621344,\n\
\ \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621344\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n\
\ \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n\
\ \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.029722152099280065,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.029722152099280065\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6437908496732027,\n \"acc_stderr\": 0.019373332420724493,\n \
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.019373332420724493\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.0301164262965406,\n\
\ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.0301164262965406\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5248739406679576,\n\
\ \"mc2_stderr\": 0.01580828169375083\n }\n}\n```"
repo_url: https://huggingface.co/LLMs/WizardLM-30B-V1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:36:33.189763.parquet'
---
# Dataset Card for Evaluation run of LLMs/WizardLM-30B-V1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/LLMs/WizardLM-30B-V1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [LLMs/WizardLM-30B-V1.0](https://huggingface.co/LLMs/WizardLM-30B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LLMs__WizardLM-30B-V1.0",
"harness_truthfulqa_mc_0",
split="train")
```
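Each per-task configuration listed in the metadata above also exposes a "latest" split alongside the timestamped split of the run. A minimal sketch, assuming the same `datasets` API and the configuration names shown in this card (here the `harness_hendrycksTest_world_religions_5` configuration is used purely as an example):
```python
from datasets import load_dataset

# Load the per-sample details for one MMLU sub-task; the "latest" split
# resolves to the most recent run listed for this configuration.
details = load_dataset(
    "open-llm-leaderboard/details_LLMs__WizardLM-30B-V1.0",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(details)
```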
## Latest results
These are the [latest results from run 2023-08-22T13:36:33.189763](https://huggingface.co/datasets/open-llm-leaderboard/details_LLMs__WizardLM-30B-V1.0/blob/main/results_2023-08-22T13%3A36%3A33.189763.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.591455168527118,
"acc_stderr": 0.033836006506247,
"acc_norm": 0.5952085142108857,
"acc_norm_stderr": 0.03381505162025332,
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.5248739406679576,
"mc2_stderr": 0.01580828169375083
},
"harness|arc:challenge|25": {
"acc": 0.6032423208191127,
"acc_stderr": 0.014296513020180647,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.014144193471893456
},
"harness|hellaswag|10": {
"acc": 0.6334395538737303,
"acc_stderr": 0.004808802114592843,
"acc_norm": 0.8327026488747261,
"acc_norm_stderr": 0.003724783389253327
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.030151134457776285,
"acc_norm": 0.6,
"acc_norm_stderr": 0.030151134457776285
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.04598188057816541,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.04598188057816541
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159788,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6935483870967742,
"acc_stderr": 0.026226485652553883,
"acc_norm": 0.6935483870967742,
"acc_norm_stderr": 0.026226485652553883
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915332,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915332
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.02493931390694079,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.02493931390694079
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815642,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815642
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.01743793717334323,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.01743793717334323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229093,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229093
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.031811497470553604,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.031811497470553604
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070415,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070415
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854934,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077785,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077785
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7701149425287356,
"acc_stderr": 0.01504630184669182,
"acc_norm": 0.7701149425287356,
"acc_norm_stderr": 0.01504630184669182
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.025522474632121612,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.025522474632121612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3776536312849162,
"acc_stderr": 0.01621414875213663,
"acc_norm": 0.3776536312849162,
"acc_norm_stderr": 0.01621414875213663
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388863,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388863
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621344,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621344
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.029722152099280065,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.029722152099280065
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.019373332420724493,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.019373332420724493
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.0301164262965406,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.0301164262965406
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.5248739406679576,
"mc2_stderr": 0.01580828169375083
}
}
```
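As a quick sketch of how the dictionary above can be consumed, the snippet below averages the `acc_norm` scores over the `hendrycksTest` (MMLU) sub-tasks. It assumes a local copy of the linked results file and that the parsed JSON has the same shape as the dictionary printed above; both the file path and that assumption are illustrative:
```python
import json

# Hypothetical local copy of the results file linked in the section above;
# the parsed dictionary is assumed to match the shape of the snippet printed here.
with open("results_2023-08-22T13:36:33.189763.json") as f:
    results = json.load(f)

# Collect the normalized accuracy of every MMLU (hendrycksTest) sub-task.
mmlu = [
    metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"Average MMLU acc_norm over {len(mmlu)} sub-tasks: {sum(mmlu) / len(mmlu):.4f}")
```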
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TaylorAI__FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model | 2023-08-27T12:44:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model](https://huggingface.co/TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TaylorAI__FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-24T06:39:02.499923](https://huggingface.co/datasets/open-llm-leaderboard/details_TaylorAI__FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model/blob/main/results_2023-08-24T06%3A39%3A02.499923.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4778725300418216,\n\
\ \"acc_stderr\": 0.035170340381620896,\n \"acc_norm\": 0.48197441475401676,\n\
\ \"acc_norm_stderr\": 0.03515576761540331,\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826842,\n \"mc2\": 0.3713852731214326,\n\
\ \"mc2_stderr\": 0.01349724997044833\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.48378839590443684,\n \"acc_stderr\": 0.014603708567414947,\n\
\ \"acc_norm\": 0.5247440273037542,\n \"acc_norm_stderr\": 0.014593487694937736\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5897231627165903,\n\
\ \"acc_stderr\": 0.004908786109095829,\n \"acc_norm\": 0.7907787293367855,\n\
\ \"acc_norm_stderr\": 0.004059213774735545\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40131578947368424,\n \"acc_stderr\": 0.03988903703336284,\n\
\ \"acc_norm\": 0.40131578947368424,\n \"acc_norm_stderr\": 0.03988903703336284\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.47547169811320755,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.47547169811320755,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.031907012423268113,\n\
\ \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.031907012423268113\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.040434618619167466,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.040434618619167466\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.023393826500484865,\n \"\
acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.023393826500484865\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.0436031486007746,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.0436031486007746\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5483870967741935,\n\
\ \"acc_stderr\": 0.02831050034856839,\n \"acc_norm\": 0.5483870967741935,\n\
\ \"acc_norm_stderr\": 0.02831050034856839\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.032826493853041504,\n\
\ \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.032826493853041504\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.037425970438065864,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.037425970438065864\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5606060606060606,\n \"acc_stderr\": 0.0353608594752948,\n \"acc_norm\"\
: 0.5606060606060606,\n \"acc_norm_stderr\": 0.0353608594752948\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.694300518134715,\n \"acc_stderr\": 0.033248379397581594,\n\
\ \"acc_norm\": 0.694300518134715,\n \"acc_norm_stderr\": 0.033248379397581594\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44358974358974357,\n \"acc_stderr\": 0.025189149894764198,\n\
\ \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.025189149894764198\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478466,\n\
\ \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478466\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6568807339449542,\n \"acc_stderr\": 0.02035477773608604,\n \"\
acc_norm\": 0.6568807339449542,\n \"acc_norm_stderr\": 0.02035477773608604\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828979,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828979\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6421568627450981,\n \"acc_stderr\": 0.033644872860882975,\n \"\
acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.033644872860882975\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6540084388185654,\n \"acc_stderr\": 0.03096481058878671,\n \
\ \"acc_norm\": 0.6540084388185654,\n \"acc_norm_stderr\": 0.03096481058878671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n\
\ \"acc_stderr\": 0.03340867501923324,\n \"acc_norm\": 0.547085201793722,\n\
\ \"acc_norm_stderr\": 0.03340867501923324\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5038167938931297,\n \"acc_stderr\": 0.043851623256015534,\n\
\ \"acc_norm\": 0.5038167938931297,\n \"acc_norm_stderr\": 0.043851623256015534\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536824,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536824\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5766871165644172,\n \"acc_stderr\": 0.03881891213334384,\n\
\ \"acc_norm\": 0.5766871165644172,\n \"acc_norm_stderr\": 0.03881891213334384\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.04825729337356388,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.04825729337356388\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7307692307692307,\n\
\ \"acc_stderr\": 0.029058588303748845,\n \"acc_norm\": 0.7307692307692307,\n\
\ \"acc_norm_stderr\": 0.029058588303748845\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6475095785440613,\n\
\ \"acc_stderr\": 0.017084150244081376,\n \"acc_norm\": 0.6475095785440613,\n\
\ \"acc_norm_stderr\": 0.017084150244081376\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.026911898686377913,\n\
\ \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.026911898686377913\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.014950103002475353,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.014950103002475353\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556047,\n\
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556047\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.027950481494401266,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.027950481494401266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.027701228468542602,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.027701228468542602\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115886,\n \
\ \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115886\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.363754889178618,\n\
\ \"acc_stderr\": 0.012286991879902889,\n \"acc_norm\": 0.363754889178618,\n\
\ \"acc_norm_stderr\": 0.012286991879902889\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4820261437908497,\n \"acc_stderr\": 0.020214761037872408,\n \
\ \"acc_norm\": 0.4820261437908497,\n \"acc_norm_stderr\": 0.020214761037872408\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.031680911612338825,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.031680911612338825\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n\
\ \"acc_stderr\": 0.03445789964362749,\n \"acc_norm\": 0.6119402985074627,\n\
\ \"acc_norm_stderr\": 0.03445789964362749\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.03647168523683227,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.03647168523683227\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708312,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708312\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826842,\n \"mc2\": 0.3713852731214326,\n\
\ \"mc2_stderr\": 0.01349724997044833\n }\n}\n```"
repo_url: https://huggingface.co/TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|arc:challenge|25_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hellaswag|10_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T06:39:02.499923.parquet'
---
# Dataset Card for Evaluation run of TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model](https://huggingface.co/TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TaylorAI__FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model",
"harness_truthfulqa_mc_0",
split="train")
```
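If you want to pull the details for every configuration listed in this card at once (for example, all of the MMLU / hendrycksTest sub-tasks), a minimal sketch along the following lines should work. It only assumes the standard `datasets` utilities and the config names declared in the YAML header above, always reads the `latest` split, and the `mmlu_details` variable name is purely illustrative.
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_TaylorAI__FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model"

# One configuration per evaluated task, as declared in this card's YAML header.
configs = get_dataset_config_names(repo)

# Keep only the MMLU (hendrycksTest) configurations and load the most recent run.
mmlu_details = {
    name: load_dataset(repo, name, split="latest")
    for name in configs
    if name.startswith("harness_hendrycksTest_")
}
print(f"Loaded details for {len(mmlu_details)} hendrycksTest configurations")
```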
## Latest results
These are the [latest results from run 2023-08-24T06:39:02.499923](https://huggingface.co/datasets/open-llm-leaderboard/details_TaylorAI__FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model/blob/main/results_2023-08-24T06%3A39%3A02.499923.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4778725300418216,
"acc_stderr": 0.035170340381620896,
"acc_norm": 0.48197441475401676,
"acc_norm_stderr": 0.03515576761540331,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826842,
"mc2": 0.3713852731214326,
"mc2_stderr": 0.01349724997044833
},
"harness|arc:challenge|25": {
"acc": 0.48378839590443684,
"acc_stderr": 0.014603708567414947,
"acc_norm": 0.5247440273037542,
"acc_norm_stderr": 0.014593487694937736
},
"harness|hellaswag|10": {
"acc": 0.5897231627165903,
"acc_stderr": 0.004908786109095829,
"acc_norm": 0.7907787293367855,
"acc_norm_stderr": 0.004059213774735545
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40131578947368424,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.40131578947368424,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.47547169811320755,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.47547169811320755,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.031907012423268113,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.031907012423268113
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.023393826500484865,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.023393826500484865
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0436031486007746,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0436031486007746
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5483870967741935,
"acc_stderr": 0.02831050034856839,
"acc_norm": 0.5483870967741935,
"acc_norm_stderr": 0.02831050034856839
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.32019704433497537,
"acc_stderr": 0.032826493853041504,
"acc_norm": 0.32019704433497537,
"acc_norm_stderr": 0.032826493853041504
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.037425970438065864,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.037425970438065864
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5606060606060606,
"acc_stderr": 0.0353608594752948,
"acc_norm": 0.5606060606060606,
"acc_norm_stderr": 0.0353608594752948
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.694300518134715,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.694300518134715,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44358974358974357,
"acc_stderr": 0.025189149894764198,
"acc_norm": 0.44358974358974357,
"acc_norm_stderr": 0.025189149894764198
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.03191863374478466,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.03191863374478466
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6568807339449542,
"acc_stderr": 0.02035477773608604,
"acc_norm": 0.6568807339449542,
"acc_norm_stderr": 0.02035477773608604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828979,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828979
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.033644872860882975,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.033644872860882975
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6540084388185654,
"acc_stderr": 0.03096481058878671,
"acc_norm": 0.6540084388185654,
"acc_norm_stderr": 0.03096481058878671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.547085201793722,
"acc_stderr": 0.03340867501923324,
"acc_norm": 0.547085201793722,
"acc_norm_stderr": 0.03340867501923324
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5038167938931297,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.5038167938931297,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536824,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536824
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5766871165644172,
"acc_stderr": 0.03881891213334384,
"acc_norm": 0.5766871165644172,
"acc_norm_stderr": 0.03881891213334384
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.046355501356099754,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.046355501356099754
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.04825729337356388,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.04825729337356388
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7307692307692307,
"acc_stderr": 0.029058588303748845,
"acc_norm": 0.7307692307692307,
"acc_norm_stderr": 0.029058588303748845
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6475095785440613,
"acc_stderr": 0.017084150244081376,
"acc_norm": 0.6475095785440613,
"acc_norm_stderr": 0.017084150244081376
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5115606936416185,
"acc_stderr": 0.026911898686377913,
"acc_norm": 0.5115606936416185,
"acc_norm_stderr": 0.026911898686377913
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475353,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475353
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.028627470550556047,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.028627470550556047
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.027950481494401266,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.027950481494401266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.027701228468542602,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.027701228468542602
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115886,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115886
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.363754889178618,
"acc_stderr": 0.012286991879902889,
"acc_norm": 0.363754889178618,
"acc_norm_stderr": 0.012286991879902889
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4820261437908497,
"acc_stderr": 0.020214761037872408,
"acc_norm": 0.4820261437908497,
"acc_norm_stderr": 0.020214761037872408
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6119402985074627,
"acc_stderr": 0.03445789964362749,
"acc_norm": 0.6119402985074627,
"acc_norm_stderr": 0.03445789964362749
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683227,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683227
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708312,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708312
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826842,
"mc2": 0.3713852731214326,
"mc2_stderr": 0.01349724997044833
}
}
```
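As a quick sanity check, the aggregated `"all"` accuracy above should be close to the unweighted mean of the per-task `acc` values (TruthfulQA only reports `mc1`/`mc2`, so it is excluded). A minimal sketch, assuming the dictionary above has already been parsed into a Python variable named `results`:
```python
# Mean per-task accuracy, skipping the pre-aggregated "all" entry and any task
# (such as harness|truthfulqa:mc|0) that reports mc1/mc2 instead of acc.
per_task_acc = [
    metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
]
mean_acc = sum(per_task_acc) / len(per_task_acc)
print(f"Mean accuracy over {len(per_task_acc)} tasks: {mean_acc:.4f}")
```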
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_garage-bAInd__Platypus-30B | 2023-08-27T12:44:23.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of garage-bAInd/Platypus-30B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [garage-bAInd/Platypus-30B](https://huggingface.co/garage-bAInd/Platypus-30B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_garage-bAInd__Platypus-30B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T14:57:43.849594](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Platypus-30B/blob/main/results_2023-08-22T14%3A57%3A43.849594.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each of them in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6416852503924539,\n\
\ \"acc_stderr\": 0.03284694843026895,\n \"acc_norm\": 0.6457464342947101,\n\
\ \"acc_norm_stderr\": 0.03282276652999888,\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.45350167421108456,\n\
\ \"mc2_stderr\": 0.014301647165211333\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.014235872487909865,\n\
\ \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.013975454122756558\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6362278430591516,\n\
\ \"acc_stderr\": 0.00480100965769044,\n \"acc_norm\": 0.8425612427803226,\n\
\ \"acc_norm_stderr\": 0.003634695906909659\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.029582245128384303,\n\
\ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.029582245128384303\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.03745554791462457,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.03745554791462457\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.03123475237772117,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.03123475237772117\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8232323232323232,\n\
\ \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.8232323232323232,\n\
\ \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.024433016466052455,\n\
\ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.024433016466052455\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.0291857149498574,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.0291857149498574\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136084,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136084\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156861,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156861\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8438818565400844,\n \"acc_stderr\": 0.023627159460318688,\n \
\ \"acc_norm\": 0.8438818565400844,\n \"acc_norm_stderr\": 0.023627159460318688\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.03021683101150878,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.03021683101150878\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.04742762361243011,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.04742762361243011\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n\
\ \"acc_stderr\": 0.014248873549217589,\n \"acc_norm\": 0.8020434227330779,\n\
\ \"acc_norm_stderr\": 0.014248873549217589\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.49162011173184356,\n\
\ \"acc_stderr\": 0.016720152794672552,\n \"acc_norm\": 0.49162011173184356,\n\
\ \"acc_norm_stderr\": 0.016720152794672552\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.0254038329781796,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.0254038329781796\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5280312907431551,\n\
\ \"acc_stderr\": 0.012750151802922442,\n \"acc_norm\": 0.5280312907431551,\n\
\ \"acc_norm_stderr\": 0.012750151802922442\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.02976826352893311,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.02976826352893311\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6895424836601307,\n \"acc_stderr\": 0.018718067052623223,\n \
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.018718067052623223\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7714285714285715,\n \"acc_stderr\": 0.026882144922307744,\n\
\ \"acc_norm\": 0.7714285714285715,\n \"acc_norm_stderr\": 0.026882144922307744\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776348,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776348\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.45350167421108456,\n\
\ \"mc2_stderr\": 0.014301647165211333\n }\n}\n```"
repo_url: https://huggingface.co/garage-bAInd/Platypus-30B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|arc:challenge|25_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hellaswag|10_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T14:57:43.849594.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T14:57:43.849594.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T14_57_43.849594
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T14:57:43.849594.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T14:57:43.849594.parquet'
---
# Dataset Card for Evaluation run of garage-bAInd/Platypus-30B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/garage-bAInd/Platypus-30B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [garage-bAInd/Platypus-30B](https://huggingface.co/garage-bAInd/Platypus-30B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_garage-bAInd__Platypus-30B",
"harness_truthfulqa_mc_0",
	split="latest")
```
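You can also target a specific run instead of the "latest" alias by selecting the timestamped split directly. Below is a minimal sketch using a task configuration and the run timestamp listed in the configuration section above; any other `harness_*` config name works the same way:
```python
from datasets import load_dataset

# Load the world_religions details for the 2023-08-22 run explicitly,
# instead of relying on the "latest" alias.
data = load_dataset(
    "open-llm-leaderboard/details_garage-bAInd__Platypus-30B",
    "harness_hendrycksTest_world_religions_5",
    split="2023_08_22T14_57_43.849594",
)
print(data)
```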
## Latest results
These are the [latest results from run 2023-08-22T14:57:43.849594](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Platypus-30B/blob/main/results_2023-08-22T14%3A57%3A43.849594.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6416852503924539,
"acc_stderr": 0.03284694843026895,
"acc_norm": 0.6457464342947101,
"acc_norm_stderr": 0.03282276652999888,
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.45350167421108456,
"mc2_stderr": 0.014301647165211333
},
"harness|arc:challenge|25": {
"acc": 0.6126279863481229,
"acc_stderr": 0.014235872487909865,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.013975454122756558
},
"harness|hellaswag|10": {
"acc": 0.6362278430591516,
"acc_stderr": 0.00480100965769044,
"acc_norm": 0.8425612427803226,
"acc_norm_stderr": 0.003634695906909659
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.029582245128384303,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.029582245128384303
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462457,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462457
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.03123475237772117,
"acc_norm": 0.8,
"acc_norm_stderr": 0.03123475237772117
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.024433016466052455,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.024433016466052455
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.0291857149498574,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.0291857149498574
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136084,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136084
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669235,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156861,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156861
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8438818565400844,
"acc_stderr": 0.023627159460318688,
"acc_norm": 0.8438818565400844,
"acc_norm_stderr": 0.023627159460318688
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.03021683101150878,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.03021683101150878
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.04742762361243011,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.04742762361243011
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.014248873549217589,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.014248873549217589
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.49162011173184356,
"acc_stderr": 0.016720152794672552,
"acc_norm": 0.49162011173184356,
"acc_norm_stderr": 0.016720152794672552
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.0254038329781796,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.0254038329781796
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5280312907431551,
"acc_stderr": 0.012750151802922442,
"acc_norm": 0.5280312907431551,
"acc_norm_stderr": 0.012750151802922442
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.02976826352893311,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.02976826352893311
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.018718067052623223,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.018718067052623223
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7714285714285715,
"acc_stderr": 0.026882144922307744,
"acc_norm": 0.7714285714285715,
"acc_norm_stderr": 0.026882144922307744
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776348,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776348
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.45350167421108456,
"mc2_stderr": 0.014301647165211333
}
}
```
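To work with these aggregated numbers programmatically rather than copying them from the card, one option is to download the results file linked above. The sketch below assumes the `huggingface_hub` client is installed and uses the file name from that link; the file layout may nest the per-task block under a "results" key, so the code handles both shapes:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the aggregated results JSON for this run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_garage-bAInd__Platypus-30B",
    filename="results_2023-08-22T14:57:43.849594.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# The card shows the per-task block at the top level; in the file it may sit
# under a "results" key, so fall back gracefully.
metrics = results.get("results", results)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
```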
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored | 2023-08-27T12:44:27.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Fredithefish/Guanaco-3B-Uncensored
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Fredithefish/Guanaco-3B-Uncensored](https://huggingface.co/Fredithefish/Guanaco-3B-Uncensored)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-24T07:19:31.389190](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored/blob/main/results_2023-08-24T07%3A19%3A31.389190.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26204734136891855,\n\
\ \"acc_stderr\": 0.03175680450173633,\n \"acc_norm\": 0.2654098025916834,\n\
\ \"acc_norm_stderr\": 0.031754213084426136,\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396723,\n \"mc2\": 0.3471127762261011,\n\
\ \"mc2_stderr\": 0.013525387427355607\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.39761092150170646,\n \"acc_stderr\": 0.01430175222327953,\n\
\ \"acc_norm\": 0.4249146757679181,\n \"acc_norm_stderr\": 0.014445698968520769\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49880501892053375,\n\
\ \"acc_stderr\": 0.004989767160811357,\n \"acc_norm\": 0.6698864767974507,\n\
\ \"acc_norm_stderr\": 0.004692926794268451\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n\
\ \"acc_stderr\": 0.03885004245800254,\n \"acc_norm\": 0.2814814814814815,\n\
\ \"acc_norm_stderr\": 0.03885004245800254\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.18,\n\
\ \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2490566037735849,\n \"acc_stderr\": 0.026616482980501708,\n\
\ \"acc_norm\": 0.2490566037735849,\n \"acc_norm_stderr\": 0.026616482980501708\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.035146974678623884,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.035146974678623884\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.029896145682095462,\n\
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.029896145682095462\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003336,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604673,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604673\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24516129032258063,\n\
\ \"acc_stderr\": 0.024472243840895535,\n \"acc_norm\": 0.24516129032258063,\n\
\ \"acc_norm_stderr\": 0.024472243840895535\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.0307127300709826,\n\
\ \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.0307127300709826\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.31313131313131315,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.31313131313131315,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2694300518134715,\n \"acc_stderr\": 0.03201867122877794,\n\
\ \"acc_norm\": 0.2694300518134715,\n \"acc_norm_stderr\": 0.03201867122877794\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.021193632525148533,\n\
\ \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.021193632525148533\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145665,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145665\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.20168067226890757,\n \"acc_stderr\": 0.026064313406304534,\n\
\ \"acc_norm\": 0.20168067226890757,\n \"acc_norm_stderr\": 0.026064313406304534\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24403669724770644,\n\
\ \"acc_stderr\": 0.018415286351416423,\n \"acc_norm\": 0.24403669724770644,\n\
\ \"acc_norm_stderr\": 0.018415286351416423\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.029157522184605603,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.029157522184605603\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658335,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21524663677130046,\n\
\ \"acc_stderr\": 0.027584066602208263,\n \"acc_norm\": 0.21524663677130046,\n\
\ \"acc_norm_stderr\": 0.027584066602208263\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.183206106870229,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.183206106870229,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.34710743801652894,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.34710743801652894,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.19642857142857142,\n\
\ \"acc_stderr\": 0.03770970049347019,\n \"acc_norm\": 0.19642857142857142,\n\
\ \"acc_norm_stderr\": 0.03770970049347019\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326469,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326469\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.028605953702004243,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.028605953702004243\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25287356321839083,\n\
\ \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.25287356321839083,\n\
\ \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2745664739884393,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.2745664739884393,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23016759776536314,\n\
\ \"acc_stderr\": 0.014078339253425826,\n \"acc_norm\": 0.23016759776536314,\n\
\ \"acc_norm_stderr\": 0.014078339253425826\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.025457756696667888,\n\
\ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.025457756696667888\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.02465968518596728,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.02465968518596728\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843014,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843014\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27249022164276404,\n\
\ \"acc_stderr\": 0.01137165829431153,\n \"acc_norm\": 0.27249022164276404,\n\
\ \"acc_norm_stderr\": 0.01137165829431153\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16176470588235295,\n \"acc_stderr\": 0.02236867256288675,\n\
\ \"acc_norm\": 0.16176470588235295,\n \"acc_norm_stderr\": 0.02236867256288675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528044,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528044\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.029923100563683906,\n\
\ \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.029923100563683906\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.03076944496729601,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.03076944496729601\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.23493975903614459,\n\
\ \"acc_stderr\": 0.03300533186128922,\n \"acc_norm\": 0.23493975903614459,\n\
\ \"acc_norm_stderr\": 0.03300533186128922\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.03424042924691583,\n\
\ \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.03424042924691583\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396723,\n \"mc2\": 0.3471127762261011,\n\
\ \"mc2_stderr\": 0.013525387427355607\n }\n}\n```"
repo_url: https://huggingface.co/Fredithefish/Guanaco-3B-Uncensored
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|arc:challenge|25_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hellaswag|10_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T07:19:31.389190.parquet'
---
# Dataset Card for Evaluation run of Fredithefish/Guanaco-3B-Uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Fredithefish/Guanaco-3B-Uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Fredithefish/Guanaco-3B-Uncensored](https://huggingface.co/Fredithefish/Guanaco-3B-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored",
"harness_truthfulqa_mc_0",
	split="latest")
```
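The same pattern works for any of the per-task configurations listed in the metadata above. As a minimal sketch (assuming the `datasets` and `pandas` libraries are installed), one of the MMLU sub-task configs can be loaded through its "latest" split and inspected as a dataframe:
```python
from datasets import load_dataset

# Any config_name from the YAML metadata above works here; the "latest" split
# always resolves to the most recent evaluation run for that task.
details = load_dataset(
    "open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)

# Convert to pandas to browse the per-example records of the evaluation.
df = details.to_pandas()
print(df.columns.tolist())
print(len(df))
```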
## Latest results
These are the [latest results from run 2023-08-24T07:19:31.389190](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored/blob/main/results_2023-08-24T07%3A19%3A31.389190.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26204734136891855,
"acc_stderr": 0.03175680450173633,
"acc_norm": 0.2654098025916834,
"acc_norm_stderr": 0.031754213084426136,
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396723,
"mc2": 0.3471127762261011,
"mc2_stderr": 0.013525387427355607
},
"harness|arc:challenge|25": {
"acc": 0.39761092150170646,
"acc_stderr": 0.01430175222327953,
"acc_norm": 0.4249146757679181,
"acc_norm_stderr": 0.014445698968520769
},
"harness|hellaswag|10": {
"acc": 0.49880501892053375,
"acc_stderr": 0.004989767160811357,
"acc_norm": 0.6698864767974507,
"acc_norm_stderr": 0.004692926794268451
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.03885004245800254,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.03885004245800254
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.25,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2490566037735849,
"acc_stderr": 0.026616482980501708,
"acc_norm": 0.2490566037735849,
"acc_norm_stderr": 0.026616482980501708
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.035146974678623884,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.035146974678623884
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.029896145682095462,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.029896145682095462
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003336,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604673,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604673
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.024472243840895535,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.024472243840895535
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.0307127300709826,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.0307127300709826
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.31313131313131315,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.31313131313131315,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2694300518134715,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.2694300518134715,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.22564102564102564,
"acc_stderr": 0.021193632525148533,
"acc_norm": 0.22564102564102564,
"acc_norm_stderr": 0.021193632525148533
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145665,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145665
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20168067226890757,
"acc_stderr": 0.026064313406304534,
"acc_norm": 0.20168067226890757,
"acc_norm_stderr": 0.026064313406304534
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24403669724770644,
"acc_stderr": 0.018415286351416423,
"acc_norm": 0.24403669724770644,
"acc_norm_stderr": 0.018415286351416423
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.029157522184605603,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.029157522184605603
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658335,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21524663677130046,
"acc_stderr": 0.027584066602208263,
"acc_norm": 0.21524663677130046,
"acc_norm_stderr": 0.027584066602208263
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.183206106870229,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.183206106870229,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.34710743801652894,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.34710743801652894,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.19642857142857142,
"acc_stderr": 0.03770970049347019,
"acc_norm": 0.19642857142857142,
"acc_norm_stderr": 0.03770970049347019
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326469,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326469
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004243,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004243
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.25287356321839083,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.25287356321839083,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2745664739884393,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.2745664739884393,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23016759776536314,
"acc_stderr": 0.014078339253425826,
"acc_norm": 0.23016759776536314,
"acc_norm_stderr": 0.014078339253425826
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.025457756696667888,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.025457756696667888
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843014,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843014
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27249022164276404,
"acc_stderr": 0.01137165829431153,
"acc_norm": 0.27249022164276404,
"acc_norm_stderr": 0.01137165829431153
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16176470588235295,
"acc_stderr": 0.02236867256288675,
"acc_norm": 0.16176470588235295,
"acc_norm_stderr": 0.02236867256288675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528044,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528044
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3224489795918367,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.3224489795918367,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729601,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729601
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-virology|5": {
"acc": 0.23493975903614459,
"acc_stderr": 0.03300533186128922,
"acc_norm": 0.23493975903614459,
"acc_norm_stderr": 0.03300533186128922
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.03424042924691583,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.03424042924691583
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396723,
"mc2": 0.3471127762261011,
"mc2_stderr": 0.013525387427355607
}
}
```
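As a minimal sketch for working with these aggregated numbers programmatically (assuming the `huggingface_hub` library is installed, and taking the results file name from the link above), the JSON file can be downloaded and parsed directly:
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file referenced in the link above.
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored",
    filename="results_2023-08-24T07:19:31.389190.json",
    repo_type="dataset",
)

with open(results_path) as f:
    results = json.load(f)

# Depending on the file layout, the metrics may sit at the top level or under a
# "results" key; either way the dictionary mirrors the one printed above.
metrics = results.get("results", results)
print(metrics["all"])
```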
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TehVenom__oasst-sft-6-llama-33b-xor-MERGED-16bit | 2023-08-27T12:44:29.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TehVenom/oasst-sft-6-llama-33b-xor-MERGED-16bit
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TehVenom/oasst-sft-6-llama-33b-xor-MERGED-16bit](https://huggingface.co/TehVenom/oasst-sft-6-llama-33b-xor-MERGED-16bit)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__oasst-sft-6-llama-33b-xor-MERGED-16bit\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-24T10:25:39.689154](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__oasst-sft-6-llama-33b-xor-MERGED-16bit/blob/main/results_2023-08-24T10%3A25%3A39.689154.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5755546412978934,\n\
\ \"acc_stderr\": 0.034446385582587884,\n \"acc_norm\": 0.5794542921505563,\n\
\ \"acc_norm_stderr\": 0.034424666814937256,\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494884,\n \"mc2\": 0.5069548720358209,\n\
\ \"mc2_stderr\": 0.015000559416370851\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216384,\n\
\ \"acc_norm\": 0.6151877133105802,\n \"acc_norm_stderr\": 0.014218371065251102\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6356303525194185,\n\
\ \"acc_stderr\": 0.004802694106203651,\n \"acc_norm\": 0.8349930292770364,\n\
\ \"acc_norm_stderr\": 0.003704282390781718\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009798,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009798\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.040329990539607195,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.040329990539607195\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.03801685104524458,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.03801685104524458\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663434,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663434\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.02467786284133278,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.02467786284133278\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6483870967741936,\n \"acc_stderr\": 0.027162537826948458,\n \"\
acc_norm\": 0.6483870967741936,\n \"acc_norm_stderr\": 0.027162537826948458\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39408866995073893,\n \"acc_stderr\": 0.03438157967036543,\n \"\
acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.03438157967036543\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624337,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624337\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836557,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836557\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5666666666666667,\n \"acc_stderr\": 0.025124653525885117,\n\
\ \"acc_norm\": 0.5666666666666667,\n \"acc_norm_stderr\": 0.025124653525885117\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7779816513761468,\n \"acc_stderr\": 0.017818849564796655,\n \"\
acc_norm\": 0.7779816513761468,\n \"acc_norm_stderr\": 0.017818849564796655\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814563,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814563\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.04243869242230524,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.04243869242230524\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.03731133519673893,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.03731133519673893\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n\
\ \"acc_stderr\": 0.015491088951494581,\n \"acc_norm\": 0.7496807151979565,\n\
\ \"acc_norm_stderr\": 0.015491088951494581\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546665,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546665\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4033519553072626,\n\
\ \"acc_stderr\": 0.01640712303219525,\n \"acc_norm\": 0.4033519553072626,\n\
\ \"acc_norm_stderr\": 0.01640712303219525\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.02799672318063144,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.02799672318063144\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.02666441088693762,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.02666441088693762\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722324,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722324\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n\
\ \"acc_stderr\": 0.012656810383983967,\n \"acc_norm\": 0.4335071707953064,\n\
\ \"acc_norm_stderr\": 0.012656810383983967\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03004261583271487,\n\
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03004261583271487\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5882352941176471,\n \"acc_stderr\": 0.019910377463105932,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.019910377463105932\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494884,\n \"mc2\": 0.5069548720358209,\n\
\ \"mc2_stderr\": 0.015000559416370851\n }\n}\n```"
repo_url: https://huggingface.co/TehVenom/oasst-sft-6-llama-33b-xor-MERGED-16bit
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|arc:challenge|25_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hellaswag|10_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T10:25:39.689154.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T10:25:39.689154.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T10_25_39.689154
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T10:25:39.689154.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T10:25:39.689154.parquet'
---
# Dataset Card for Evaluation run of TehVenom/oasst-sft-6-llama-33b-xor-MERGED-16bit
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/oasst-sft-6-llama-33b-xor-MERGED-16bit
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TehVenom/oasst-sft-6-llama-33b-xor-MERGED-16bit](https://huggingface.co/TehVenom/oasst-sft-6-llama-33b-xor-MERGED-16bit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__oasst-sft-6-llama-33b-xor-MERGED-16bit",
"harness_truthfulqa_mc_0",
split="train")
```
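
Each per-task configuration listed in the YAML above also exposes its timestamped splits and a `latest` alias, so a single subtask can be loaded the same way. A minimal sketch (the configuration and split names are taken from the config list above):
```python
from datasets import load_dataset

# Minimal sketch: load the "latest" split of one per-task configuration
# (configuration and split names follow the YAML config list above).
world_religions = load_dataset(
    "open-llm-leaderboard/details_TehVenom__oasst-sft-6-llama-33b-xor-MERGED-16bit",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
```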
## Latest results
These are the [latest results from run 2023-08-24T10:25:39.689154](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__oasst-sft-6-llama-33b-xor-MERGED-16bit/blob/main/results_2023-08-24T10%3A25%3A39.689154.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5755546412978934,
"acc_stderr": 0.034446385582587884,
"acc_norm": 0.5794542921505563,
"acc_norm_stderr": 0.034424666814937256,
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494884,
"mc2": 0.5069548720358209,
"mc2_stderr": 0.015000559416370851
},
"harness|arc:challenge|25": {
"acc": 0.5844709897610921,
"acc_stderr": 0.014401366641216384,
"acc_norm": 0.6151877133105802,
"acc_norm_stderr": 0.014218371065251102
},
"harness|hellaswag|10": {
"acc": 0.6356303525194185,
"acc_stderr": 0.004802694106203651,
"acc_norm": 0.8349930292770364,
"acc_norm_stderr": 0.003704282390781718
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009798,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009798
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.040329990539607195,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.040329990539607195
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.03801685104524458,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.03801685104524458
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663434,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663434
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.02467786284133278,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.02467786284133278
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.027162537826948458,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.027162537826948458
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.03438157967036543,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.03438157967036543
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624337,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624337
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836557,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836557
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5666666666666667,
"acc_stderr": 0.025124653525885117,
"acc_norm": 0.5666666666666667,
"acc_norm_stderr": 0.025124653525885117
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7779816513761468,
"acc_stderr": 0.017818849564796655,
"acc_norm": 0.7779816513761468,
"acc_norm_stderr": 0.017818849564796655
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.04243869242230524,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.04243869242230524
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497752,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497752
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.03731133519673893,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.03731133519673893
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7496807151979565,
"acc_stderr": 0.015491088951494581,
"acc_norm": 0.7496807151979565,
"acc_norm_stderr": 0.015491088951494581
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.025574123786546665,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.025574123786546665
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4033519553072626,
"acc_stderr": 0.01640712303219525,
"acc_norm": 0.4033519553072626,
"acc_norm_stderr": 0.01640712303219525
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.02799672318063144,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.02799672318063144
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.02666441088693762,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.02666441088693762
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.026774929899722324,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.026774929899722324
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.02942799403941999,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.02942799403941999
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983967,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983967
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03004261583271487,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03004261583271487
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.019910377463105932,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.019910377463105932
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494884,
"mc2": 0.5069548720358209,
"mc2_stderr": 0.015000559416370851
}
}
```
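
The aggregated numbers above are also exposed through the "results" configuration mentioned in the summary. A minimal sketch of loading it (configuration name as described above; per the card, the "train" split always points to the latest results):
```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration described above.
# Per the card, the "train" split points to the latest run's results.
results = load_dataset(
    "open-llm-leaderboard/details_TehVenom__oasst-sft-6-llama-33b-xor-MERGED-16bit",
    "results",
    split="train",
)
```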
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_bigscience__bloomz-7b1 | 2023-09-22T17:52:42.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of bigscience/bloomz-7b1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bigscience/bloomz-7b1](https://huggingface.co/bigscience/bloomz-7b1) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigscience__bloomz-7b1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T17:52:30.288263](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloomz-7b1/blob/main/results_2023-09-22T17-52-30.288263.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.23406040268456377,\n\
\ \"em_stderr\": 0.00433611594363341,\n \"f1\": 0.2680463506711411,\n\
\ \"f1_stderr\": 0.004351989813189547,\n \"acc\": 0.323583494946364,\n\
\ \"acc_stderr\": 0.007097345688161246\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.23406040268456377,\n \"em_stderr\": 0.00433611594363341,\n\
\ \"f1\": 0.2680463506711411,\n \"f1_stderr\": 0.004351989813189547\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225419\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6464088397790055,\n \"acc_stderr\": 0.01343654126259995\n\
\ }\n}\n```"
repo_url: https://huggingface.co/bigscience/bloomz-7b1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|arc:challenge|25_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|arc:challenge|25_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T17_52_30.288263
path:
- '**/details_harness|drop|3_2023-09-22T17-52-30.288263.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T17-52-30.288263.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T17_52_30.288263
path:
- '**/details_harness|gsm8k|5_2023-09-22T17-52-30.288263.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T17-52-30.288263.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hellaswag|10_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hellaswag|10_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T10:10:08.875186.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T11:29:59.333088.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T10_10_08.875186
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T10:10:08.875186.parquet'
- split: 2023_08_22T11_29_59.333088
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T11:29:59.333088.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T11:29:59.333088.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T17_52_30.288263
path:
- '**/details_harness|winogrande|5_2023-09-22T17-52-30.288263.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T17-52-30.288263.parquet'
- config_name: results
data_files:
- split: 2023_09_22T17_52_30.288263
path:
- results_2023-09-22T17-52-30.288263.parquet
- split: latest
path:
- results_2023-09-22T17-52-30.288263.parquet
---
# Dataset Card for Evaluation run of bigscience/bloomz-7b1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bigscience/bloomz-7b1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bigscience/bloomz-7b1](https://huggingface.co/bigscience/bloomz-7b1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigscience__bloomz-7b1",
"harness_winogrande_5",
	split="latest")
```
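If you only need the aggregated metrics rather than the per-sample details, a minimal sketch is to load the "results" configuration at its "latest" split (both names are taken from the YAML header of this card; the exact row layout of the results parquet may vary between runs):
```python
from datasets import load_dataset

# Load the aggregated results (one parquet file per run) at the latest timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_bigscience__bloomz-7b1",
    "results",
    split="latest",
)

# Inspect the first row of aggregated metrics.
print(results[0])
```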
## Latest results
These are the [latest results from run 2023-09-22T17:52:30.288263](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloomz-7b1/blob/main/results_2023-09-22T17-52-30.288263.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.23406040268456377,
"em_stderr": 0.00433611594363341,
"f1": 0.2680463506711411,
"f1_stderr": 0.004351989813189547,
"acc": 0.323583494946364,
"acc_stderr": 0.007097345688161246
},
"harness|drop|3": {
"em": 0.23406040268456377,
"em_stderr": 0.00433611594363341,
"f1": 0.2680463506711411,
"f1_stderr": 0.004351989813189547
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225419
},
"harness|winogrande|5": {
"acc": 0.6464088397790055,
"acc_stderr": 0.01343654126259995
}
}
```
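The same summary can also be fetched directly as JSON. The sketch below uses `hf_hub_download` from the `huggingface_hub` library with the file name taken from the link above, and it defensively handles the possibility that the metrics shown here are nested under a top-level "results" key (the exact layout depends on the harness version):
```python
import json

from huggingface_hub import hf_hub_download

# Download the results JSON referenced above from this dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_bigscience__bloomz-7b1",
    filename="results_2023-09-22T17-52-30.288263.json",
    repo_type="dataset",
)

with open(path) as f:
    raw = json.load(f)

# Depending on the harness version, the per-task metrics may sit under "results".
metrics = raw.get("results", raw)
print(metrics["all"])
```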
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TFLai__pythia-2.8b-4bit-alpaca | 2023-08-27T12:44:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TFLai/pythia-2.8b-4bit-alpaca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/pythia-2.8b-4bit-alpaca](https://huggingface.co/TFLai/pythia-2.8b-4bit-alpaca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__pythia-2.8b-4bit-alpaca\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T17:37:58.174329](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__pythia-2.8b-4bit-alpaca/blob/main/results_2023-08-22T17%3A37%3A58.174329.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2594255076233803,\n\
\ \"acc_stderr\": 0.03168859410451261,\n \"acc_norm\": 0.2625217156742845,\n\
\ \"acc_norm_stderr\": 0.03169438618791141,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570342,\n \"mc2\": 0.3914056867537695,\n\
\ \"mc2_stderr\": 0.013897246654724475\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.310580204778157,\n \"acc_stderr\": 0.013522292098053054,\n\
\ \"acc_norm\": 0.34726962457337884,\n \"acc_norm_stderr\": 0.013913034529620437\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44363672575184226,\n\
\ \"acc_stderr\": 0.004957976789260528,\n \"acc_norm\": 0.5896235809599681,\n\
\ \"acc_norm_stderr\": 0.00490896727822248\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.035834961763610625,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.035834961763610625\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749864,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749864\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.027678452578212373,\n\
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.027678452578212373\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003337,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003337\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.023000086859068642,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.023000086859068642\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.14285714285714285,\n\
\ \"acc_stderr\": 0.03129843185743808,\n \"acc_norm\": 0.14285714285714285,\n\
\ \"acc_norm_stderr\": 0.03129843185743808\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.25161290322580643,\n \"acc_stderr\": 0.024685979286239956,\n \"\
acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.024685979286239956\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15763546798029557,\n \"acc_stderr\": 0.025639014131172404,\n \"\
acc_norm\": 0.15763546798029557,\n \"acc_norm_stderr\": 0.025639014131172404\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.0340150671524904,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.0340150671524904\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2474747474747475,\n \"acc_stderr\": 0.030746300742124488,\n \"\
acc_norm\": 0.2474747474747475,\n \"acc_norm_stderr\": 0.030746300742124488\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24870466321243523,\n \"acc_stderr\": 0.031195840877700307,\n\
\ \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.031195840877700307\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.27692307692307694,\n \"acc_stderr\": 0.022688042352424994,\n\
\ \"acc_norm\": 0.27692307692307694,\n \"acc_norm_stderr\": 0.022688042352424994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279483,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279483\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21100917431192662,\n \"acc_stderr\": 0.017493922404112648,\n \"\
acc_norm\": 0.21100917431192662,\n \"acc_norm_stderr\": 0.017493922404112648\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3194444444444444,\n \"acc_stderr\": 0.03179876342176852,\n \"\
acc_norm\": 0.3194444444444444,\n \"acc_norm_stderr\": 0.03179876342176852\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604264,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604264\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.24472573839662448,\n \"acc_stderr\": 0.02798569938703641,\n \
\ \"acc_norm\": 0.24472573839662448,\n \"acc_norm_stderr\": 0.02798569938703641\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.24663677130044842,\n\
\ \"acc_stderr\": 0.028930413120910888,\n \"acc_norm\": 0.24663677130044842,\n\
\ \"acc_norm_stderr\": 0.028930413120910888\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755805,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755805\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.028120966503914407,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.028120966503914407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2681992337164751,\n\
\ \"acc_stderr\": 0.015842430835269424,\n \"acc_norm\": 0.2681992337164751,\n\
\ \"acc_norm_stderr\": 0.015842430835269424\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.28901734104046245,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.28901734104046245,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n\
\ \"acc_stderr\": 0.025218040373410612,\n \"acc_norm\": 0.27009646302250806,\n\
\ \"acc_norm_stderr\": 0.025218040373410612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.025407197798890155,\n\
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.025407197798890155\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290396,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2796610169491525,\n\
\ \"acc_stderr\": 0.011463397393861955,\n \"acc_norm\": 0.2796610169491525,\n\
\ \"acc_norm_stderr\": 0.011463397393861955\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3272058823529412,\n \"acc_stderr\": 0.028501452860396567,\n\
\ \"acc_norm\": 0.3272058823529412,\n \"acc_norm_stderr\": 0.028501452860396567\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.272875816993464,\n \"acc_stderr\": 0.018020474148393577,\n \
\ \"acc_norm\": 0.272875816993464,\n \"acc_norm_stderr\": 0.018020474148393577\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.040693063197213775,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.040693063197213775\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22040816326530613,\n \"acc_stderr\": 0.026537045312145294,\n\
\ \"acc_norm\": 0.22040816326530613,\n \"acc_norm_stderr\": 0.026537045312145294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n\
\ \"acc_stderr\": 0.03141784291663925,\n \"acc_norm\": 0.20481927710843373,\n\
\ \"acc_norm_stderr\": 0.03141784291663925\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570342,\n \"mc2\": 0.3914056867537695,\n\
\ \"mc2_stderr\": 0.013897246654724475\n }\n}\n```"
repo_url: https://huggingface.co/TFLai/pythia-2.8b-4bit-alpaca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|arc:challenge|25_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hellaswag|10_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:37:58.174329.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:37:58.174329.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T17_37_58.174329
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T17:37:58.174329.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T17:37:58.174329.parquet'
---
# Dataset Card for Evaluation run of TFLai/pythia-2.8b-4bit-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/pythia-2.8b-4bit-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/pythia-2.8b-4bit-alpaca](https://huggingface.co/TFLai/pythia-2.8b-4bit-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__pythia-2.8b-4bit-alpaca",
"harness_truthfulqa_mc_0",
split="train")
```
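Each per-task configuration above also exposes a `latest` split; as a minimal sketch (using one of the configurations listed in this card), you can point at it directly instead of `train`:
```python
from datasets import load_dataset

# "latest" always resolves to the newest timestamped run for that configuration,
# so this returns the same rows that the "train" split currently points to.
data = load_dataset("open-llm-leaderboard/details_TFLai__pythia-2.8b-4bit-alpaca",
	"harness_hendrycksTest_world_religions_5",
	split="latest")
print(data)
```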
## Latest results
These are the [latest results from run 2023-08-22T17:37:58.174329](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__pythia-2.8b-4bit-alpaca/blob/main/results_2023-08-22T17%3A37%3A58.174329.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2594255076233803,
"acc_stderr": 0.03168859410451261,
"acc_norm": 0.2625217156742845,
"acc_norm_stderr": 0.03169438618791141,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570342,
"mc2": 0.3914056867537695,
"mc2_stderr": 0.013897246654724475
},
"harness|arc:challenge|25": {
"acc": 0.310580204778157,
"acc_stderr": 0.013522292098053054,
"acc_norm": 0.34726962457337884,
"acc_norm_stderr": 0.013913034529620437
},
"harness|hellaswag|10": {
"acc": 0.44363672575184226,
"acc_stderr": 0.004957976789260528,
"acc_norm": 0.5896235809599681,
"acc_norm_stderr": 0.00490896727822248
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.035834961763610625,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.035834961763610625
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749864,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749864
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.027678452578212373,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.027678452578212373
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003337,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003337
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.023000086859068642,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.023000086859068642
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.14285714285714285,
"acc_stderr": 0.03129843185743808,
"acc_norm": 0.14285714285714285,
"acc_norm_stderr": 0.03129843185743808
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15763546798029557,
"acc_stderr": 0.025639014131172404,
"acc_norm": 0.15763546798029557,
"acc_norm_stderr": 0.025639014131172404
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.0340150671524904,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.0340150671524904
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2474747474747475,
"acc_stderr": 0.030746300742124488,
"acc_norm": 0.2474747474747475,
"acc_norm_stderr": 0.030746300742124488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24870466321243523,
"acc_stderr": 0.031195840877700307,
"acc_norm": 0.24870466321243523,
"acc_norm_stderr": 0.031195840877700307
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.27692307692307694,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.27692307692307694,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279483,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279483
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21100917431192662,
"acc_stderr": 0.017493922404112648,
"acc_norm": 0.21100917431192662,
"acc_norm_stderr": 0.017493922404112648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.03179876342176852,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.03179876342176852
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604264,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24472573839662448,
"acc_stderr": 0.02798569938703641,
"acc_norm": 0.24472573839662448,
"acc_norm_stderr": 0.02798569938703641
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.24663677130044842,
"acc_stderr": 0.028930413120910888,
"acc_norm": 0.24663677130044842,
"acc_norm_stderr": 0.028930413120910888
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755805,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755805
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.028120966503914407,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.028120966503914407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2681992337164751,
"acc_stderr": 0.015842430835269424,
"acc_norm": 0.2681992337164751,
"acc_norm_stderr": 0.015842430835269424
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410612,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.025407197798890155,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.025407197798890155
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290396,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2796610169491525,
"acc_stderr": 0.011463397393861955,
"acc_norm": 0.2796610169491525,
"acc_norm_stderr": 0.011463397393861955
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3272058823529412,
"acc_stderr": 0.028501452860396567,
"acc_norm": 0.3272058823529412,
"acc_norm_stderr": 0.028501452860396567
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.272875816993464,
"acc_stderr": 0.018020474148393577,
"acc_norm": 0.272875816993464,
"acc_norm_stderr": 0.018020474148393577
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.040693063197213775,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.040693063197213775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22040816326530613,
"acc_stderr": 0.026537045312145294,
"acc_norm": 0.22040816326530613,
"acc_norm_stderr": 0.026537045312145294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663925,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663925
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570342,
"mc2": 0.3914056867537695,
"mc2_stderr": 0.013897246654724475
}
}
```
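If you only need headline numbers from the dictionary above, a minimal sketch (assuming `results` holds that dict) is to average the per-task MMLU accuracies:
```python
# Minimal sketch: `results` is assumed to be the dictionary shown above.
mmlu = {k: v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mmlu_avg = sum(mmlu.values()) / len(mmlu)
print(f"Average accuracy over {len(mmlu)} MMLU tasks: {mmlu_avg:.4f}")
```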
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Open-Orca__LlongOrca-13B-16k | 2023-09-23T02:40:02.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Open-Orca/LlongOrca-13B-16k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Open-Orca/LlongOrca-13B-16k](https://huggingface.co/Open-Orca/LlongOrca-13B-16k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Open-Orca__LlongOrca-13B-16k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T02:39:50.739204](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__LlongOrca-13B-16k/blob/main/results_2023-09-23T02-39-50.739204.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24842701342281878,\n\
\ \"em_stderr\": 0.004425115813837482,\n \"f1\": 0.3159280620805379,\n\
\ \"f1_stderr\": 0.004388510945380163,\n \"acc\": 0.4434148948074197,\n\
\ \"acc_stderr\": 0.010487468726575147\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.24842701342281878,\n \"em_stderr\": 0.004425115813837482,\n\
\ \"f1\": 0.3159280620805379,\n \"f1_stderr\": 0.004388510945380163\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12282031842304776,\n \
\ \"acc_stderr\": 0.009041108602874675\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275621\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Open-Orca/LlongOrca-13B-16k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|arc:challenge|25_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T02_39_50.739204
path:
- '**/details_harness|drop|3_2023-09-23T02-39-50.739204.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T02-39-50.739204.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T02_39_50.739204
path:
- '**/details_harness|gsm8k|5_2023-09-23T02-39-50.739204.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T02-39-50.739204.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hellaswag|10_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:31:15.034338.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T10_31_15.034338
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T10:31:15.034338.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T10:31:15.034338.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T02_39_50.739204
path:
- '**/details_harness|winogrande|5_2023-09-23T02-39-50.739204.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T02-39-50.739204.parquet'
- config_name: results
data_files:
- split: 2023_09_23T02_39_50.739204
path:
- results_2023-09-23T02-39-50.739204.parquet
- split: latest
path:
- results_2023-09-23T02-39-50.739204.parquet
---
# Dataset Card for Evaluation run of Open-Orca/LlongOrca-13B-16k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Open-Orca/LlongOrca-13B-16k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Open-Orca/LlongOrca-13B-16k](https://huggingface.co/Open-Orca/LlongOrca-13B-16k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Open-Orca__LlongOrca-13B-16k",
"harness_winogrande_5",
split="train")
```
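Because this dataset has been built from two runs, each configuration also keeps one split per run timestamp (see the `configs` list above); a minimal sketch for pinning a specific run instead of the latest one:
```python
from datasets import load_dataset

# Use the timestamped split name from the configs above to pin a specific run
# rather than relying on the moving "train"/"latest" splits.
data = load_dataset("open-llm-leaderboard/details_Open-Orca__LlongOrca-13B-16k",
	"harness_winogrande_5",
	split="2023_09_23T02_39_50.739204")
print(data)
```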
## Latest results
These are the [latest results from run 2023-09-23T02:39:50.739204](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__LlongOrca-13B-16k/blob/main/results_2023-09-23T02-39-50.739204.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.24842701342281878,
"em_stderr": 0.004425115813837482,
"f1": 0.3159280620805379,
"f1_stderr": 0.004388510945380163,
"acc": 0.4434148948074197,
"acc_stderr": 0.010487468726575147
},
"harness|drop|3": {
"em": 0.24842701342281878,
"em_stderr": 0.004425115813837482,
"f1": 0.3159280620805379,
"f1_stderr": 0.004388510945380163
},
"harness|gsm8k|5": {
"acc": 0.12282031842304776,
"acc_stderr": 0.009041108602874675
},
"harness|winogrande|5": {
"acc": 0.7640094711917916,
"acc_stderr": 0.011933828850275621
}
}
```
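The aggregated numbers above are also exposed through the `results` configuration declared in the YAML header; as a sketch, its `latest` split should return the same metrics as a dataset:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics; its "latest" split
# points at the newest results parquet file listed in this card's configs.
results = load_dataset("open-llm-leaderboard/details_Open-Orca__LlongOrca-13B-16k",
	"results",
	split="latest")
print(results)
```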
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Prasann15479/shephered | 2023-08-27T11:58:41.000Z | [
"region:us"
] | Prasann15479 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_amazon__LightGPT | 2023-09-18T05:09:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of amazon/LightGPT
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [amazon/LightGPT](https://huggingface.co/amazon/LightGPT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_amazon__LightGPT\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T05:09:39.039109](https://huggingface.co/datasets/open-llm-leaderboard/details_amazon__LightGPT/blob/main/results_2023-09-18T05-09-39.039109.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.053796140939597316,\n\
\ \"em_stderr\": 0.0023105084978365595,\n \"f1\": 0.11191694630872479,\n\
\ \"f1_stderr\": 0.0026210067753728973,\n \"acc\": 0.3417479818067908,\n\
\ \"acc_stderr\": 0.009380315320833657\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.053796140939597316,\n \"em_stderr\": 0.0023105084978365595,\n\
\ \"f1\": 0.11191694630872479,\n \"f1_stderr\": 0.0026210067753728973\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03866565579984837,\n \
\ \"acc_stderr\": 0.005310583162098055\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6448303078137332,\n \"acc_stderr\": 0.013450047479569257\n\
\ }\n}\n```"
repo_url: https://huggingface.co/amazon/LightGPT
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|arc:challenge|25_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T05_09_39.039109
path:
- '**/details_harness|drop|3_2023-09-18T05-09-39.039109.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T05-09-39.039109.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T05_09_39.039109
path:
- '**/details_harness|gsm8k|5_2023-09-18T05-09-39.039109.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T05-09-39.039109.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hellaswag|10_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T11:09:14.917369.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T11_09_14.917369
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T11:09:14.917369.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T11:09:14.917369.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T05_09_39.039109
path:
- '**/details_harness|winogrande|5_2023-09-18T05-09-39.039109.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T05-09-39.039109.parquet'
- config_name: results
data_files:
- split: 2023_09_18T05_09_39.039109
path:
- results_2023-09-18T05-09-39.039109.parquet
- split: latest
path:
- results_2023-09-18T05-09-39.039109.parquet
---
# Dataset Card for Evaluation run of amazon/LightGPT
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/amazon/LightGPT
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [amazon/LightGPT](https://huggingface.co/amazon/LightGPT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_amazon__LightGPT",
"harness_winogrande_5",
split="train")
```
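For instance, the aggregated metrics can be loaded from the "results" configuration in a similar way. The snippet below is a sketch: the config name and the "latest" split are taken from the configs listed above, and it assumes those split names are exposed as dataset splits, with the exact column layout of the returned rows left unspecified.
```python
from datasets import load_dataset

# Sketch: load the aggregated metrics stored in the "results" configuration.
# "results" and "latest" are the config and split names listed in this card;
# the exact column layout of the returned rows is an assumption.
results = load_dataset(
    "open-llm-leaderboard/details_amazon__LightGPT",
    "results",
    split="latest",
)

# Inspect the first row of aggregated results.
print(results[0])
```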
## Latest results
These are the [latest results from run 2023-09-18T05:09:39.039109](https://huggingface.co/datasets/open-llm-leaderboard/details_amazon__LightGPT/blob/main/results_2023-09-18T05-09-39.039109.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.053796140939597316,
"em_stderr": 0.0023105084978365595,
"f1": 0.11191694630872479,
"f1_stderr": 0.0026210067753728973,
"acc": 0.3417479818067908,
"acc_stderr": 0.009380315320833657
},
"harness|drop|3": {
"em": 0.053796140939597316,
"em_stderr": 0.0023105084978365595,
"f1": 0.11191694630872479,
"f1_stderr": 0.0026210067753728973
},
"harness|gsm8k|5": {
"acc": 0.03866565579984837,
"acc_stderr": 0.005310583162098055
},
"harness|winogrande|5": {
"acc": 0.6448303078137332,
"acc_stderr": 0.013450047479569257
}
}
```
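For a rough sense of the uncertainty on these numbers, a 95% confidence interval can be approximated from the reported mean and standard error with a normal approximation; this is a standard reading of the stderr values, not something computed by the harness itself.
```python
# Rough 95% confidence interval for the 5-shot Winogrande accuracy above,
# using a normal approximation (mean ± 1.96 * standard error).
acc = 0.6448303078137332
acc_stderr = 0.013450047479569257

lower = acc - 1.96 * acc_stderr
upper = acc + 1.96 * acc_stderr
print(f"Winogrande acc: {acc:.3f}, 95% CI ≈ [{lower:.3f}, {upper:.3f}]")
# -> roughly [0.618, 0.671]
```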
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_ehartford__Samantha-1.11-13b | 2023-08-27T12:44:39.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ehartford/Samantha-1.11-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/Samantha-1.11-13b](https://huggingface.co/ehartford/Samantha-1.11-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 60 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__Samantha-1.11-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-24T08:47:37.032058](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-13b/blob/main/results_2023-08-24T08%3A47%3A37.032058.json)\
  \ (note that there might be results for other tasks in the repos if successive evals\
  \ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5607623224991172,\n\
\ \"acc_stderr\": 0.03433612415303688,\n \"acc_norm\": 0.5649845037245038,\n\
\ \"acc_norm_stderr\": 0.03431394470382357,\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.4771721773008265,\n\
\ \"mc2_stderr\": 0.014845959298298671\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5639931740614335,\n \"acc_stderr\": 0.014491225699230916,\n\
\ \"acc_norm\": 0.6083617747440273,\n \"acc_norm_stderr\": 0.014264122124938215\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6251742680740888,\n\
\ \"acc_stderr\": 0.004830885704380083,\n \"acc_norm\": 0.8299143596893049,\n\
\ \"acc_norm_stderr\": 0.003749401775087307\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526066,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526066\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.0404633688397825,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.0404633688397825\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325583,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325583\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.032500536843658404,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.032500536843658404\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.024326310529149138,\n \"\
acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 0.024326310529149138\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\
\ \"acc_stderr\": 0.026923446059302837,\n \"acc_norm\": 0.6612903225806451,\n\
\ \"acc_norm_stderr\": 0.026923446059302837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.034454876862647144,\n\
\ \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.034454876862647144\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.033322999210706444,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.033322999210706444\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850158,\n\
\ \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850158\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.744954128440367,\n \"acc_stderr\": 0.01868850085653585,\n \"acc_norm\"\
: 0.744954128440367,\n \"acc_norm_stderr\": 0.01868850085653585\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n\
\ \"acc_stderr\": 0.03376922151252336,\n \"acc_norm\": 0.4305555555555556,\n\
\ \"acc_norm_stderr\": 0.03376922151252336\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967409,\n\
\ \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967409\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035303,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035303\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697625,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697625\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922726,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922726\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\
\ \"acc_stderr\": 0.015246803197398675,\n \"acc_norm\": 0.7611749680715197,\n\
\ \"acc_norm_stderr\": 0.015246803197398675\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895806,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895806\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.014950103002475363,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.014950103002475363\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.027634176689602656,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.027634176689602656\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934023,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934023\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n\
\ \"acc_stderr\": 0.012620785155885998,\n \"acc_norm\": 0.423728813559322,\n\
\ \"acc_norm_stderr\": 0.012620785155885998\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329387,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329387\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5473856209150327,\n \"acc_stderr\": 0.020136790918492527,\n \
\ \"acc_norm\": 0.5473856209150327,\n \"acc_norm_stderr\": 0.020136790918492527\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547728,\n\
\ \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547728\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.4771721773008265,\n\
\ \"mc2_stderr\": 0.014845959298298671\n }\n}\n```"
repo_url: https://huggingface.co/ehartford/Samantha-1.11-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|arc:challenge|25_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hellaswag|10_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T08:47:37.032058.parquet'
---
# Dataset Card for Evaluation run of ehartford/Samantha-1.11-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/Samantha-1.11-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/Samantha-1.11-13b](https://huggingface.co/ehartford/Samantha-1.11-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__Samantha-1.11-13b",
"harness_truthfulqa_mc_0",
	split="latest")
```
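For finer-grained inspection, here is a minimal sketch along the same lines; the config name and the `latest` split are taken from the listing above, while the exact column names may vary with the harness version:
```python
from datasets import load_dataset

# Per-sample details for a single MMLU subtask; the config name follows the
# harness_hendrycksTest_<subject>_5 pattern listed in the YAML header above.
details = load_dataset(
    "open-llm-leaderboard/details_ehartford__Samantha-1.11-13b",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)

print(details.column_names)  # inspect the available fields first
print(details[0])            # first evaluated example of the run
```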
## Latest results
These are the [latest results from run 2023-08-24T08:47:37.032058](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-13b/blob/main/results_2023-08-24T08%3A47%3A37.032058.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in the "results" configuration and in the "latest" split of the corresponding eval):
```python
{
"all": {
"acc": 0.5607623224991172,
"acc_stderr": 0.03433612415303688,
"acc_norm": 0.5649845037245038,
"acc_norm_stderr": 0.03431394470382357,
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.4771721773008265,
"mc2_stderr": 0.014845959298298671
},
"harness|arc:challenge|25": {
"acc": 0.5639931740614335,
"acc_stderr": 0.014491225699230916,
"acc_norm": 0.6083617747440273,
"acc_norm_stderr": 0.014264122124938215
},
"harness|hellaswag|10": {
"acc": 0.6251742680740888,
"acc_stderr": 0.004830885704380083,
"acc_norm": 0.8299143596893049,
"acc_norm_stderr": 0.003749401775087307
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.0404633688397825,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.0404633688397825
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325583,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325583
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.032500536843658404,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.032500536843658404
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.024326310529149138,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.024326310529149138
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302837,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.034454876862647144,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.034454876862647144
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.033322999210706444,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.033322999210706444
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850158,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850158
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.744954128440367,
"acc_stderr": 0.01868850085653585,
"acc_norm": 0.744954128440367,
"acc_norm_stderr": 0.01868850085653585
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967409,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967409
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035303,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035303
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.03181149747055359,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.03181149747055359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.04373313040914761,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.04373313040914761
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697625,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697625
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922726,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922726
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398675,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398675
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895806,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895806
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475363,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475363
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.027634176689602656,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.027634176689602656
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934023,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934023
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100793,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100793
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.423728813559322,
"acc_stderr": 0.012620785155885998,
"acc_norm": 0.423728813559322,
"acc_norm_stderr": 0.012620785155885998
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329387,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329387
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5473856209150327,
"acc_stderr": 0.020136790918492527,
"acc_norm": 0.5473856209150327,
"acc_norm_stderr": 0.020136790918492527
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547728,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547728
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.4771721773008265,
"mc2_stderr": 0.014845959298298671
}
}
```
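For readers who want to re-aggregate these per-task numbers themselves, here is a hedged sketch: the results filename is inferred from the link above, and the overall "all" entry also folds in ARC and HellaSwag, so a pure MMLU macro-average will not match it exactly.
```python
import json
from huggingface_hub import hf_hub_download

# Fetch the raw results file linked above (filename assumed from the link text).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_ehartford__Samantha-1.11-13b",
    filename="results_2023-08-24T08:47:37.032058.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Depending on the harness version, per-task scores sit either at the top level
# (as printed above) or nested under a "results" key.
results = data.get("results", data)

mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
macro_acc_norm = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, macro-averaged acc_norm = {macro_acc_norm:.4f}")
```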
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_ehartford__Samantha-1.11-70b | 2023-08-27T12:44:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ehartford/Samantha-1.11-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/Samantha-1.11-70b](https://huggingface.co/ehartford/Samantha-1.11-70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__Samantha-1.11-70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-23T18:30:58.468070](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-70b/blob/main/results_2023-08-23T18%3A30%3A58.468070.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.677813015899243,\n\
\ \"acc_stderr\": 0.031237474871293903,\n \"acc_norm\": 0.6818837535900895,\n\
\ \"acc_norm_stderr\": 0.031205668786764153,\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.6501725704722767,\n\
\ \"mc2_stderr\": 0.014792841820249373\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6527303754266212,\n \"acc_stderr\": 0.013913034529620451,\n\
\ \"acc_norm\": 0.7005119453924915,\n \"acc_norm_stderr\": 0.013385021637313574\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6831308504282015,\n\
\ \"acc_stderr\": 0.004643050902503911,\n \"acc_norm\": 0.8755228042222665,\n\
\ \"acc_norm_stderr\": 0.003294504807555238\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03309615177059007,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03309615177059007\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6468085106382979,\n \"acc_stderr\": 0.031245325202761926,\n\
\ \"acc_norm\": 0.6468085106382979,\n \"acc_norm_stderr\": 0.031245325202761926\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419036,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419036\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n\
\ \"acc_stderr\": 0.02233170761182307,\n \"acc_norm\": 0.8096774193548387,\n\
\ \"acc_norm_stderr\": 0.02233170761182307\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865383,\n\
\ \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865383\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6974358974358974,\n \"acc_stderr\": 0.023290888053772725,\n\
\ \"acc_norm\": 0.6974358974358974,\n \"acc_norm_stderr\": 0.023290888053772725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.027553614467863814,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.027553614467863814\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.423841059602649,\n \"acc_stderr\": 0.04034846678603397,\n \"acc_norm\"\
: 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603397\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8807339449541285,\n\
\ \"acc_stderr\": 0.01389572929258896,\n \"acc_norm\": 0.8807339449541285,\n\
\ \"acc_norm_stderr\": 0.01389572929258896\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n\
\ \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.02132833757080437,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.02132833757080437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \
\ \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n\
\ \"acc_stderr\": 0.027584066602208263,\n \"acc_norm\": 0.7847533632286996,\n\
\ \"acc_norm_stderr\": 0.027584066602208263\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.0321782942074463,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.0321782942074463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.03008309871603521,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.03008309871603521\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580663,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.01911989279892498,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.01911989279892498\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.859514687100894,\n\
\ \"acc_stderr\": 0.012426211353093443,\n \"acc_norm\": 0.859514687100894,\n\
\ \"acc_norm_stderr\": 0.012426211353093443\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.02353292543104428,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.02353292543104428\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5094972067039106,\n\
\ \"acc_stderr\": 0.016719484643348777,\n \"acc_norm\": 0.5094972067039106,\n\
\ \"acc_norm_stderr\": 0.016719484643348777\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n\
\ \"acc_stderr\": 0.025122637608816646,\n \"acc_norm\": 0.7331189710610932,\n\
\ \"acc_norm_stderr\": 0.025122637608816646\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8117283950617284,\n \"acc_stderr\": 0.021751866060815885,\n\
\ \"acc_norm\": 0.8117283950617284,\n \"acc_norm_stderr\": 0.021751866060815885\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.5410691003911343,\n \"acc_stderr\": 0.012727084826799802,\n\
\ \"acc_norm\": 0.5410691003911343,\n \"acc_norm_stderr\": 0.012727084826799802\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6948529411764706,\n \"acc_stderr\": 0.0279715413701706,\n \"acc_norm\"\
: 0.6948529411764706,\n \"acc_norm_stderr\": 0.0279715413701706\n },\n\
\ \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7516339869281046,\n\
\ \"acc_stderr\": 0.017479487001364764,\n \"acc_norm\": 0.7516339869281046,\n\
\ \"acc_norm_stderr\": 0.017479487001364764\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n\
\ \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7795918367346939,\n\
\ \"acc_stderr\": 0.02653704531214529,\n \"acc_norm\": 0.7795918367346939,\n\
\ \"acc_norm_stderr\": 0.02653704531214529\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101716,\n\
\ \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101716\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n\
\ \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n\
\ \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n\
\ \"acc_stderr\": 0.02567934272327692,\n \"acc_norm\": 0.8713450292397661,\n\
\ \"acc_norm_stderr\": 0.02567934272327692\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168106,\n\
\ \"mc2\": 0.6501725704722767,\n \"mc2_stderr\": 0.014792841820249373\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/Samantha-1.11-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|arc:challenge|25_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hellaswag|10_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T18:30:58.468070.parquet'
---
# Dataset Card for Evaluation run of ehartford/Samantha-1.11-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/Samantha-1.11-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/Samantha-1.11-70b](https://huggingface.co/ehartford/Samantha-1.11-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__Samantha-1.11-70b",
"harness_truthfulqa_mc_0",
	split="latest")
```
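The aggregated metrics described above can be read the same way. The following is a minimal sketch, assuming the "results" configuration exposes the same "latest" split naming as the per-task configurations listed in this card's metadata:
```python
from datasets import load_dataset

# Run-level aggregated metrics live in the "results" configuration
# (the configuration name comes from the description above; the "latest"
# split name is an assumption based on the per-task configurations).
results = load_dataset(
    "open-llm-leaderboard/details_ehartford__Samantha-1.11-70b",
    "results",
    split="latest",
)
print(results[0])
```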
## Latest results
These are the [latest results from run 2023-08-23T18:30:58.468070](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-70b/blob/main/results_2023-08-23T18%3A30%3A58.468070.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.677813015899243,
"acc_stderr": 0.031237474871293903,
"acc_norm": 0.6818837535900895,
"acc_norm_stderr": 0.031205668786764153,
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.6501725704722767,
"mc2_stderr": 0.014792841820249373
},
"harness|arc:challenge|25": {
"acc": 0.6527303754266212,
"acc_stderr": 0.013913034529620451,
"acc_norm": 0.7005119453924915,
"acc_norm_stderr": 0.013385021637313574
},
"harness|hellaswag|10": {
"acc": 0.6831308504282015,
"acc_stderr": 0.004643050902503911,
"acc_norm": 0.8755228042222665,
"acc_norm_stderr": 0.003294504807555238
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059007,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059007
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6468085106382979,
"acc_stderr": 0.031245325202761926,
"acc_norm": 0.6468085106382979,
"acc_norm_stderr": 0.031245325202761926
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.02233170761182307,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.02233170761182307
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865383,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865383
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6974358974358974,
"acc_stderr": 0.023290888053772725,
"acc_norm": 0.6974358974358974,
"acc_norm_stderr": 0.023290888053772725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.027553614467863814,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.027553614467863814
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603397,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603397
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8807339449541285,
"acc_stderr": 0.01389572929258896,
"acc_norm": 0.8807339449541285,
"acc_norm_stderr": 0.01389572929258896
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.02132833757080437,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.02132833757080437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.021644195727955173,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.021644195727955173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.027584066602208263,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.027584066602208263
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.0321782942074463,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.0321782942074463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.03008309871603521,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.03008309871603521
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580663,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.01911989279892498,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.01911989279892498
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.859514687100894,
"acc_stderr": 0.012426211353093443,
"acc_norm": 0.859514687100894,
"acc_norm_stderr": 0.012426211353093443
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.02353292543104428,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.02353292543104428
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5094972067039106,
"acc_stderr": 0.016719484643348777,
"acc_norm": 0.5094972067039106,
"acc_norm_stderr": 0.016719484643348777
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7331189710610932,
"acc_stderr": 0.025122637608816646,
"acc_norm": 0.7331189710610932,
"acc_norm_stderr": 0.025122637608816646
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8117283950617284,
"acc_stderr": 0.021751866060815885,
"acc_norm": 0.8117283950617284,
"acc_norm_stderr": 0.021751866060815885
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5410691003911343,
"acc_stderr": 0.012727084826799802,
"acc_norm": 0.5410691003911343,
"acc_norm_stderr": 0.012727084826799802
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.0279715413701706,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.0279715413701706
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7795918367346939,
"acc_stderr": 0.02653704531214529,
"acc_norm": 0.7795918367346939,
"acc_norm_stderr": 0.02653704531214529
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101716,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101716
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.02567934272327692,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.02567934272327692
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.6501725704722767,
"mc2_stderr": 0.014792841820249373
}
}
```
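The same numbers can also be fetched directly from the repository as a JSON file. This is a minimal sketch, assuming the `huggingface_hub` package is installed and that the results file keeps the name shown in the link above:
```python
import json
from huggingface_hub import hf_hub_download

# Download the per-run results file from the dataset repository
# (the filename is taken from the link above; adjust it if the repo layout changes).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_ehartford__Samantha-1.11-70b",
    filename="results_2023-08-23T18:30:58.468070.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Harness result files may nest per-task metrics under a "results" key;
# fall back to the top level if they do not.
metrics = data.get("results", data)
print(metrics["all"]["acc_norm"])
```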
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_DevaMalla__llama-base-7b | 2023-09-16T19:28:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of DevaMalla/llama-base-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DevaMalla/llama-base-7b](https://huggingface.co/DevaMalla/llama-base-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DevaMalla__llama-base-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T19:28:44.136872](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama-base-7b/blob/main/results_2023-09-16T19-28-44.136872.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.0003314581465219126,\n \"f1\": 0.056186031879194784,\n\
\ \"f1_stderr\": 0.0012858243614759428,\n \"acc\": 0.3749593848153363,\n\
\ \"acc_stderr\": 0.008901319861891403\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219126,\n\
\ \"f1\": 0.056186031879194784,\n \"f1_stderr\": 0.0012858243614759428\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0356330553449583,\n \
\ \"acc_stderr\": 0.00510610785374419\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.012696531870038616\n\
\ }\n}\n```"
repo_url: https://huggingface.co/DevaMalla/llama-base-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T19_28_44.136872
path:
- '**/details_harness|drop|3_2023-09-16T19-28-44.136872.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T19-28-44.136872.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T19_28_44.136872
path:
- '**/details_harness|gsm8k|5_2023-09-16T19-28-44.136872.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T19-28-44.136872.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:24:25.697077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T13_24_25.697077
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:24:25.697077.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:24:25.697077.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T19_28_44.136872
path:
- '**/details_harness|winogrande|5_2023-09-16T19-28-44.136872.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T19-28-44.136872.parquet'
- config_name: results
data_files:
- split: 2023_09_16T19_28_44.136872
path:
- results_2023-09-16T19-28-44.136872.parquet
- split: latest
path:
- results_2023-09-16T19-28-44.136872.parquet
---
# Dataset Card for Evaluation run of DevaMalla/llama-base-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DevaMalla/llama-base-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [DevaMalla/llama-base-7b](https://huggingface.co/DevaMalla/llama-base-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DevaMalla__llama-base-7b",
"harness_winogrande_5",
split="train")
```
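
The split names declared in the configuration block above can also be used directly. As a minimal sketch (assuming the `datasets` library is installed and using the config and split names shown in the YAML configuration), you can load the "latest" split of a single task configuration:
```python
from datasets import load_dataset

# Load the "latest" split of one task configuration
# (config and split names come from the YAML configuration above).
winogrande_details = load_dataset(
    "open-llm-leaderboard/details_DevaMalla__llama-base-7b",
    "harness_winogrande_5",
    split="latest",
)
print(winogrande_details)
```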
## Latest results
These are the [latest results from run 2023-09-16T19:28:44.136872](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama-base-7b/blob/main/results_2023-09-16T19-28-44.136872.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219126,
"f1": 0.056186031879194784,
"f1_stderr": 0.0012858243614759428,
"acc": 0.3749593848153363,
"acc_stderr": 0.008901319861891403
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219126,
"f1": 0.056186031879194784,
"f1_stderr": 0.0012858243614759428
},
"harness|gsm8k|5": {
"acc": 0.0356330553449583,
"acc_stderr": 0.00510610785374419
},
"harness|winogrande|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.012696531870038616
}
}
```
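
These aggregated numbers are also stored in the "results" configuration listed above. As a sketch (assuming the same split naming as in the configuration block), they can be loaded with:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics shown above.
results = load_dataset(
    "open-llm-leaderboard/details_DevaMalla__llama-base-7b",
    "results",
    split="latest",
)
print(results[0])  # one row per run with the aggregated scores
```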
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_lmsys__vicuna-33b-v1.3 | 2023-09-17T00:50:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lmsys/vicuna-33b-v1.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lmsys/vicuna-33b-v1.3](https://huggingface.co/lmsys/vicuna-33b-v1.3) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lmsys__vicuna-33b-v1.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T00:50:29.265762](https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-33b-v1.3/blob/main/results_2023-09-17T00-50-29.265762.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24611996644295303,\n\
\ \"em_stderr\": 0.004411275638567265,\n \"f1\": 0.3191652684563765,\n\
\ \"f1_stderr\": 0.004369271114420946,\n \"acc\": 0.4537743848183282,\n\
\ \"acc_stderr\": 0.010649726923219326\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.24611996644295303,\n \"em_stderr\": 0.004411275638567265,\n\
\ \"f1\": 0.3191652684563765,\n \"f1_stderr\": 0.004369271114420946\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1372251705837756,\n \
\ \"acc_stderr\": 0.00947780824460041\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838243\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lmsys/vicuna-33b-v1.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|arc:challenge|25_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T00_50_29.265762
path:
- '**/details_harness|drop|3_2023-09-17T00-50-29.265762.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T00-50-29.265762.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T00_50_29.265762
path:
- '**/details_harness|gsm8k|5_2023-09-17T00-50-29.265762.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T00-50-29.265762.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hellaswag|10_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:55:51.049874.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T14_55_51.049874
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T14:55:51.049874.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T14:55:51.049874.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T00_50_29.265762
path:
- '**/details_harness|winogrande|5_2023-09-17T00-50-29.265762.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T00-50-29.265762.parquet'
- config_name: results
data_files:
- split: 2023_09_17T00_50_29.265762
path:
- results_2023-09-17T00-50-29.265762.parquet
- split: latest
path:
- results_2023-09-17T00-50-29.265762.parquet
---
# Dataset Card for Evaluation run of lmsys/vicuna-33b-v1.3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lmsys/vicuna-33b-v1.3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lmsys/vicuna-33b-v1.3](https://huggingface.co/lmsys/vicuna-33b-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lmsys__vicuna-33b-v1.3",
"harness_winogrande_5",
split="train")
```
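
Since the repository exposes one configuration per evaluated task, the full list of configurations can also be discovered programmatically. A minimal sketch using the standard `datasets` helper:
```python
from datasets import get_dataset_config_names

# List every task configuration available in this evaluation repository.
configs = get_dataset_config_names("open-llm-leaderboard/details_lmsys__vicuna-33b-v1.3")
print(len(configs), "configurations")
print(configs[:5])
```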
## Latest results
These are the [latest results from run 2023-09-17T00:50:29.265762](https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-33b-v1.3/blob/main/results_2023-09-17T00-50-29.265762.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.24611996644295303,
"em_stderr": 0.004411275638567265,
"f1": 0.3191652684563765,
"f1_stderr": 0.004369271114420946,
"acc": 0.4537743848183282,
"acc_stderr": 0.010649726923219326
},
"harness|drop|3": {
"em": 0.24611996644295303,
"em_stderr": 0.004411275638567265,
"f1": 0.3191652684563765,
"f1_stderr": 0.004369271114420946
},
"harness|gsm8k|5": {
"acc": 0.1372251705837756,
"acc_stderr": 0.00947780824460041
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838243
}
}
```
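
For quick inspection or comparison across runs, the aggregated "results" configuration can also be pulled into a pandas DataFrame. A sketch, assuming pandas is installed alongside `datasets`:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration and convert it to pandas
# for easier inspection of the metrics shown above.
results = load_dataset(
    "open-llm-leaderboard/details_lmsys__vicuna-33b-v1.3",
    "results",
    split="latest",
)
df = results.to_pandas()
print(df.head())
```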
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_nomic-ai__gpt4all-j | 2023-10-03T19:48:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of nomic-ai/gpt4all-j
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nomic-ai/gpt4all-j](https://huggingface.co/nomic-ai/gpt4all-j) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nomic-ai__gpt4all-j\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T19:47:17.747407](https://huggingface.co/datasets/open-llm-leaderboard/details_nomic-ai__gpt4all-j/blob/main/results_2023-10-03T19-47-17.747407.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2872188662344035,\n\
\ \"acc_stderr\": 0.03256180276717163,\n \"acc_norm\": 0.2903884169662422,\n\
\ \"acc_norm_stderr\": 0.03256130411949783,\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777305,\n \"mc2\": 0.4277581711709451,\n\
\ \"mc2_stderr\": 0.014665895347989117\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.38993174061433444,\n \"acc_stderr\": 0.014252959848892884,\n\
\ \"acc_norm\": 0.4197952218430034,\n \"acc_norm_stderr\": 0.014422181226303022\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.483469428400717,\n\
\ \"acc_stderr\": 0.004987053652540272,\n \"acc_norm\": 0.6406094403505278,\n\
\ \"acc_norm_stderr\": 0.004788412062375707\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.32452830188679244,\n \"acc_stderr\": 0.028815615713432118,\n\
\ \"acc_norm\": 0.32452830188679244,\n \"acc_norm_stderr\": 0.028815615713432118\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2986111111111111,\n\
\ \"acc_stderr\": 0.038270523579507554,\n \"acc_norm\": 0.2986111111111111,\n\
\ \"acc_norm_stderr\": 0.038270523579507554\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793254,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793254\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.028346963777162452,\n\
\ \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.028346963777162452\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.35172413793103446,\n \"acc_stderr\": 0.03979236637497412,\n\
\ \"acc_norm\": 0.35172413793103446,\n \"acc_norm_stderr\": 0.03979236637497412\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.21164021164021163,\n \"acc_stderr\": 0.021037331505262893,\n \"\
acc_norm\": 0.21164021164021163,\n \"acc_norm_stderr\": 0.021037331505262893\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.2645161290322581,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n\
\ \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626302,\n \"\
acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626302\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2694300518134715,\n \"acc_stderr\": 0.03201867122877794,\n\
\ \"acc_norm\": 0.2694300518134715,\n \"acc_norm_stderr\": 0.03201867122877794\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.32051282051282054,\n \"acc_stderr\": 0.023661296393964273,\n\
\ \"acc_norm\": 0.32051282051282054,\n \"acc_norm_stderr\": 0.023661296393964273\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.029953823891887037,\n\
\ \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.029953823891887037\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23486238532110093,\n \"acc_stderr\": 0.018175110510343588,\n \"\
acc_norm\": 0.23486238532110093,\n \"acc_norm_stderr\": 0.018175110510343588\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604257,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604257\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.27848101265822783,\n \"acc_stderr\": 0.02917868230484255,\n \
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.02917868230484255\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.25112107623318386,\n\
\ \"acc_stderr\": 0.029105220833224595,\n \"acc_norm\": 0.25112107623318386,\n\
\ \"acc_norm_stderr\": 0.029105220833224595\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.33884297520661155,\n \"acc_stderr\": 0.04320767807536669,\n \"\
acc_norm\": 0.33884297520661155,\n \"acc_norm_stderr\": 0.04320767807536669\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.038935425188248475,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.038935425188248475\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.038946411200447915,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.038946411200447915\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\
\ \"acc_stderr\": 0.029614323690456648,\n \"acc_norm\": 0.2863247863247863,\n\
\ \"acc_norm_stderr\": 0.029614323690456648\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.015671006009339586,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.015671006009339586\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.022989592543123563,\n\
\ \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.022989592543123563\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3366013071895425,\n \"acc_stderr\": 0.027057974624494382,\n\
\ \"acc_norm\": 0.3366013071895425,\n \"acc_norm_stderr\": 0.027057974624494382\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3247588424437299,\n\
\ \"acc_stderr\": 0.026596782287697046,\n \"acc_norm\": 0.3247588424437299,\n\
\ \"acc_norm_stderr\": 0.026596782287697046\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.025630824975621348,\n\
\ \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.025630824975621348\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2872340425531915,\n \"acc_stderr\": 0.026992199173064356,\n \
\ \"acc_norm\": 0.2872340425531915,\n \"acc_norm_stderr\": 0.026992199173064356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2516297262059974,\n\
\ \"acc_stderr\": 0.011083276280441912,\n \"acc_norm\": 0.2516297262059974,\n\
\ \"acc_norm_stderr\": 0.011083276280441912\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.029722152099280055,\n\
\ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.029722152099280055\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03831305140884603,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03831305140884603\n },\n\
\ \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3673469387755102,\n\
\ \"acc_stderr\": 0.030862144921087558,\n \"acc_norm\": 0.3673469387755102,\n\
\ \"acc_norm_stderr\": 0.030862144921087558\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.3482587064676617,\n \"acc_stderr\": 0.033687874661154596,\n\
\ \"acc_norm\": 0.3482587064676617,\n \"acc_norm_stderr\": 0.033687874661154596\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.2289156626506024,\n \"acc_stderr\": 0.03270745277352477,\n\
\ \"acc_norm\": 0.2289156626506024,\n \"acc_norm_stderr\": 0.03270745277352477\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.30994152046783624,\n\
\ \"acc_stderr\": 0.03546976959393163,\n \"acc_norm\": 0.30994152046783624,\n\
\ \"acc_norm_stderr\": 0.03546976959393163\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2827417380660955,\n \"mc1_stderr\": 0.015764770836777305,\n\
\ \"mc2\": 0.4277581711709451,\n \"mc2_stderr\": 0.014665895347989117\n\
\ }\n}\n```"
repo_url: https://huggingface.co/nomic-ai/gpt4all-j
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|arc:challenge|25_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T00_56_07.678298
path:
- '**/details_harness|drop|3_2023-09-18T00-56-07.678298.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T00-56-07.678298.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T00_56_07.678298
path:
- '**/details_harness|gsm8k|5_2023-09-18T00-56-07.678298.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T00-56-07.678298.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hellaswag|10_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T00_56_07.678298
path:
- '**/details_harness|winogrande|5_2023-09-18T00-56-07.678298.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T00-56-07.678298.parquet'
- config_name: results
data_files:
- split: 2023_09_18T00_56_07.678298
path:
- results_2023-09-18T00-56-07.678298.parquet
- split: 2023_10_03T19_47_17.747407
path:
- results_2023-10-03T19-47-17.747407.parquet
- split: latest
path:
- results_2023-10-03T19-47-17.747407.parquet
---
# Dataset Card for Evaluation run of nomic-ai/gpt4all-j
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nomic-ai/gpt4all-j
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nomic-ai/gpt4all-j](https://huggingface.co/nomic-ai/gpt4all-j) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nomic-ai__gpt4all-j",
"harness_truthfulqa_mc_0",
	split="latest")
```
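
The same `load_dataset` pattern also works for the aggregated `results` configuration or for a specific timestamped run. As a minimal sketch, the config and split names below are the ones listed in this card's metadata:

```python
from datasets import load_dataset

# Aggregated metrics used by the leaderboard (the "results" configuration).
results = load_dataset("open-llm-leaderboard/details_nomic-ai__gpt4all-j",
                       "results",
                       split="latest")

# Per-sample details for one MMLU subtask from a specific run, selected by
# its timestamped split name (see the configs listed in the card metadata).
details = load_dataset("open-llm-leaderboard/details_nomic-ai__gpt4all-j",
                       "harness_hendrycksTest_abstract_algebra_5",
                       split="2023_10_03T19_47_17.747407")
```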
## Latest results
These are the [latest results from run 2023-10-03T19:47:17.747407](https://huggingface.co/datasets/open-llm-leaderboard/details_nomic-ai__gpt4all-j/blob/main/results_2023-10-03T19-47-17.747407.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.2872188662344035,
"acc_stderr": 0.03256180276717163,
"acc_norm": 0.2903884169662422,
"acc_norm_stderr": 0.03256130411949783,
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777305,
"mc2": 0.4277581711709451,
"mc2_stderr": 0.014665895347989117
},
"harness|arc:challenge|25": {
"acc": 0.38993174061433444,
"acc_stderr": 0.014252959848892884,
"acc_norm": 0.4197952218430034,
"acc_norm_stderr": 0.014422181226303022
},
"harness|hellaswag|10": {
"acc": 0.483469428400717,
"acc_stderr": 0.004987053652540272,
"acc_norm": 0.6406094403505278,
"acc_norm_stderr": 0.004788412062375707
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.32452830188679244,
"acc_stderr": 0.028815615713432118,
"acc_norm": 0.32452830188679244,
"acc_norm_stderr": 0.028815615713432118
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2986111111111111,
"acc_stderr": 0.038270523579507554,
"acc_norm": 0.2986111111111111,
"acc_norm_stderr": 0.038270523579507554
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793254,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793254
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.028346963777162452,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.028346963777162452
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748142,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748142
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.35172413793103446,
"acc_stderr": 0.03979236637497412,
"acc_norm": 0.35172413793103446,
"acc_norm_stderr": 0.03979236637497412
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21164021164021163,
"acc_stderr": 0.021037331505262893,
"acc_norm": 0.21164021164021163,
"acc_norm_stderr": 0.021037331505262893
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243156,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3383838383838384,
"acc_stderr": 0.03371124142626302,
"acc_norm": 0.3383838383838384,
"acc_norm_stderr": 0.03371124142626302
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2694300518134715,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.2694300518134715,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32051282051282054,
"acc_stderr": 0.023661296393964273,
"acc_norm": 0.32051282051282054,
"acc_norm_stderr": 0.023661296393964273
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3067226890756303,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.3067226890756303,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23486238532110093,
"acc_stderr": 0.018175110510343588,
"acc_norm": 0.23486238532110093,
"acc_norm_stderr": 0.018175110510343588
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604257,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.02917868230484255,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.02917868230484255
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.25112107623318386,
"acc_stderr": 0.029105220833224595,
"acc_norm": 0.25112107623318386,
"acc_norm_stderr": 0.029105220833224595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.33884297520661155,
"acc_stderr": 0.04320767807536669,
"acc_norm": 0.33884297520661155,
"acc_norm_stderr": 0.04320767807536669
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.038935425188248475,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.038935425188248475
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.038946411200447915,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.038946411200447915
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.029614323690456648,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.029614323690456648
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.015671006009339586,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.015671006009339586
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.022989592543123563,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.022989592543123563
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3366013071895425,
"acc_stderr": 0.027057974624494382,
"acc_norm": 0.3366013071895425,
"acc_norm_stderr": 0.027057974624494382
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3247588424437299,
"acc_stderr": 0.026596782287697046,
"acc_norm": 0.3247588424437299,
"acc_norm_stderr": 0.026596782287697046
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.025630824975621348,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.025630824975621348
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2872340425531915,
"acc_stderr": 0.026992199173064356,
"acc_norm": 0.2872340425531915,
"acc_norm_stderr": 0.026992199173064356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2516297262059974,
"acc_stderr": 0.011083276280441912,
"acc_norm": 0.2516297262059974,
"acc_norm_stderr": 0.011083276280441912
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.029722152099280055,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.029722152099280055
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884603,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884603
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3673469387755102,
"acc_stderr": 0.030862144921087558,
"acc_norm": 0.3673469387755102,
"acc_norm_stderr": 0.030862144921087558
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3482587064676617,
"acc_stderr": 0.033687874661154596,
"acc_norm": 0.3482587064676617,
"acc_norm_stderr": 0.033687874661154596
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2289156626506024,
"acc_stderr": 0.03270745277352477,
"acc_norm": 0.2289156626506024,
"acc_norm_stderr": 0.03270745277352477
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.03546976959393163,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.03546976959393163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777305,
"mc2": 0.4277581711709451,
"mc2_stderr": 0.014665895347989117
}
}
```
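If you only need this aggregated file rather than the per-task details, one option is to download the JSON linked above directly; the snippet below is a sketch that assumes the file sits at the repository root under the name shown in the link:
```python
import json
from huggingface_hub import hf_hub_download

# Fetch the aggregated results file referenced above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_nomic-ai__gpt4all-j",
    filename="results_2023-10-03T19-47-17.747407.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(list(results))  # inspect the top-level keys before drilling down
```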
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_facebook__opt-iml-max-1.3b | 2023-08-27T12:44:48.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of facebook/opt-iml-max-1.3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [facebook/opt-iml-max-1.3b](https://huggingface.co/facebook/opt-iml-max-1.3b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_facebook__opt-iml-max-1.3b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T09:51:53.668877](https://huggingface.co/datasets/open-llm-leaderboard/details_facebook__opt-iml-max-1.3b/blob/main/results_2023-08-22T09%3A51%3A53.668877.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27830572189653424,\n\
\ \"acc_stderr\": 0.03226567604110693,\n \"acc_norm\": 0.2810728038026702,\n\
\ \"acc_norm_stderr\": 0.0322752387176325,\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.014679255032111066,\n \"mc2\": 0.3834241864559234,\n\
\ \"mc2_stderr\": 0.014237766301129044\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.27047781569965873,\n \"acc_stderr\": 0.012980954547659554,\n\
\ \"acc_norm\": 0.30716723549488056,\n \"acc_norm_stderr\": 0.013481034054980945\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4115714001194981,\n\
\ \"acc_stderr\": 0.004911125101064648,\n \"acc_norm\": 0.5381398127862975,\n\
\ \"acc_norm_stderr\": 0.004975243508752004\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.15,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174022,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174022\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.1513157894736842,\n \"acc_stderr\": 0.029162631596843982,\n\
\ \"acc_norm\": 0.1513157894736842,\n \"acc_norm_stderr\": 0.029162631596843982\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.32452830188679244,\n \"acc_stderr\": 0.028815615713432115,\n\
\ \"acc_norm\": 0.32452830188679244,\n \"acc_norm_stderr\": 0.028815615713432115\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.03745554791462457,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.03745554791462457\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.02937917046412483,\n\
\ \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.02937917046412483\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518752,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518752\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.32903225806451614,\n \"acc_stderr\": 0.026729499068349972,\n \"\
acc_norm\": 0.32903225806451614,\n \"acc_norm_stderr\": 0.026729499068349972\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.18226600985221675,\n \"acc_stderr\": 0.02716334085964515,\n \"\
acc_norm\": 0.18226600985221675,\n \"acc_norm_stderr\": 0.02716334085964515\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268048,\n\
\ \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268048\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.38341968911917096,\n \"acc_stderr\": 0.03508984236295341,\n\
\ \"acc_norm\": 0.38341968911917096,\n \"acc_norm_stderr\": 0.03508984236295341\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.30256410256410254,\n \"acc_stderr\": 0.02329088805377274,\n\
\ \"acc_norm\": 0.30256410256410254,\n \"acc_norm_stderr\": 0.02329088805377274\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25210084033613445,\n \"acc_stderr\": 0.028205545033277723,\n\
\ \"acc_norm\": 0.25210084033613445,\n \"acc_norm_stderr\": 0.028205545033277723\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3357798165137615,\n \"acc_stderr\": 0.02024808139675293,\n \"\
acc_norm\": 0.3357798165137615,\n \"acc_norm_stderr\": 0.02024808139675293\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28921568627450983,\n \"acc_stderr\": 0.03182231867647554,\n \"\
acc_norm\": 0.28921568627450983,\n \"acc_norm_stderr\": 0.03182231867647554\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.35864978902953587,\n \"acc_stderr\": 0.03121956944530184,\n \
\ \"acc_norm\": 0.35864978902953587,\n \"acc_norm_stderr\": 0.03121956944530184\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3452914798206278,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.3452914798206278,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3435114503816794,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.3435114503816794,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n\
\ \"acc_stderr\": 0.03834241021419073,\n \"acc_norm\": 0.20535714285714285,\n\
\ \"acc_norm_stderr\": 0.03834241021419073\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.048026946982589726,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.048026946982589726\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.23931623931623933,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.32567049808429116,\n\
\ \"acc_stderr\": 0.016757989458549682,\n \"acc_norm\": 0.32567049808429116,\n\
\ \"acc_norm_stderr\": 0.016757989458549682\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.023786203255508277,\n\
\ \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.023786203255508277\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.21675977653631284,\n\
\ \"acc_stderr\": 0.01378059848644334,\n \"acc_norm\": 0.21675977653631284,\n\
\ \"acc_norm_stderr\": 0.01378059848644334\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.32679738562091504,\n \"acc_stderr\": 0.026857294663281416,\n\
\ \"acc_norm\": 0.32679738562091504,\n \"acc_norm_stderr\": 0.026857294663281416\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19614147909967847,\n\
\ \"acc_stderr\": 0.02255244778047803,\n \"acc_norm\": 0.19614147909967847,\n\
\ \"acc_norm_stderr\": 0.02255244778047803\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.27469135802469136,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.27469135802469136,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2801418439716312,\n \"acc_stderr\": 0.02678917235114025,\n \
\ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.02678917235114025\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n\
\ \"acc_stderr\": 0.011015752255279343,\n \"acc_norm\": 0.2470664928292047,\n\
\ \"acc_norm_stderr\": 0.011015752255279343\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2867647058823529,\n \"acc_stderr\": 0.02747227447323382,\n\
\ \"acc_norm\": 0.2867647058823529,\n \"acc_norm_stderr\": 0.02747227447323382\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177788,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177788\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.02892058322067558,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02892058322067558\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.31343283582089554,\n\
\ \"acc_stderr\": 0.032801882053486414,\n \"acc_norm\": 0.31343283582089554,\n\
\ \"acc_norm_stderr\": 0.032801882053486414\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23976608187134502,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.23976608187134502,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.014679255032111066,\n \"mc2\": 0.3834241864559234,\n\
\ \"mc2_stderr\": 0.014237766301129044\n }\n}\n```"
repo_url: https://huggingface.co/facebook/opt-iml-max-1.3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|arc:challenge|25_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hellaswag|10_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T09:51:53.668877.parquet'
---
# Dataset Card for Evaluation run of facebook/opt-iml-max-1.3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/facebook/opt-iml-max-1.3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [facebook/opt-iml-max-1.3b](https://huggingface.co/facebook/opt-iml-max-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_facebook__opt-iml-max-1.3b",
"harness_truthfulqa_mc_0",
split="train")
```
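The timestamped splits listed in the configuration section above can also be loaded directly if you want a specific run rather than the latest one; as a sketch (the split name is taken from the file listing above):
```python
from datasets import load_dataset

# Load the run identified by its timestamp instead of the always-latest split.
data = load_dataset("open-llm-leaderboard/details_facebook__opt-iml-max-1.3b",
                    "harness_truthfulqa_mc_0",
                    split="2023_08_22T09_51_53.668877")
print(data)
```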
## Latest results
These are the [latest results from run 2023-08-22T09:51:53.668877](https://huggingface.co/datasets/open-llm-leaderboard/details_facebook__opt-iml-max-1.3b/blob/main/results_2023-08-22T09%3A51%3A53.668877.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27830572189653424,
"acc_stderr": 0.03226567604110693,
"acc_norm": 0.2810728038026702,
"acc_norm_stderr": 0.0322752387176325,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.014679255032111066,
"mc2": 0.3834241864559234,
"mc2_stderr": 0.014237766301129044
},
"harness|arc:challenge|25": {
"acc": 0.27047781569965873,
"acc_stderr": 0.012980954547659554,
"acc_norm": 0.30716723549488056,
"acc_norm_stderr": 0.013481034054980945
},
"harness|hellaswag|10": {
"acc": 0.4115714001194981,
"acc_stderr": 0.004911125101064648,
"acc_norm": 0.5381398127862975,
"acc_norm_stderr": 0.004975243508752004
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174022,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174022
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.1513157894736842,
"acc_stderr": 0.029162631596843982,
"acc_norm": 0.1513157894736842,
"acc_norm_stderr": 0.029162631596843982
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.32452830188679244,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.32452830188679244,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03745554791462457,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03745554791462457
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.02937917046412483,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.02937917046412483
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518752,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518752
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276864,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276864
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.32903225806451614,
"acc_stderr": 0.026729499068349972,
"acc_norm": 0.32903225806451614,
"acc_norm_stderr": 0.026729499068349972
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18226600985221675,
"acc_stderr": 0.02716334085964515,
"acc_norm": 0.18226600985221675,
"acc_norm_stderr": 0.02716334085964515
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.03567969772268048,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.03567969772268048
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.38341968911917096,
"acc_stderr": 0.03508984236295341,
"acc_norm": 0.38341968911917096,
"acc_norm_stderr": 0.03508984236295341
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30256410256410254,
"acc_stderr": 0.02329088805377274,
"acc_norm": 0.30256410256410254,
"acc_norm_stderr": 0.02329088805377274
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25210084033613445,
"acc_stderr": 0.028205545033277723,
"acc_norm": 0.25210084033613445,
"acc_norm_stderr": 0.028205545033277723
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3357798165137615,
"acc_stderr": 0.02024808139675293,
"acc_norm": 0.3357798165137615,
"acc_norm_stderr": 0.02024808139675293
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28921568627450983,
"acc_stderr": 0.03182231867647554,
"acc_norm": 0.28921568627450983,
"acc_norm_stderr": 0.03182231867647554
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.35864978902953587,
"acc_stderr": 0.03121956944530184,
"acc_norm": 0.35864978902953587,
"acc_norm_stderr": 0.03121956944530184
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3452914798206278,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.3452914798206278,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3435114503816794,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.3435114503816794,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.03834241021419073,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.03834241021419073
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.048026946982589726,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.048026946982589726
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23931623931623933,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.23931623931623933,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.32567049808429116,
"acc_stderr": 0.016757989458549682,
"acc_norm": 0.32567049808429116,
"acc_norm_stderr": 0.016757989458549682
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.023786203255508277,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.023786203255508277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.21675977653631284,
"acc_stderr": 0.01378059848644334,
"acc_norm": 0.21675977653631284,
"acc_norm_stderr": 0.01378059848644334
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.32679738562091504,
"acc_stderr": 0.026857294663281416,
"acc_norm": 0.32679738562091504,
"acc_norm_stderr": 0.026857294663281416
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19614147909967847,
"acc_stderr": 0.02255244778047803,
"acc_norm": 0.19614147909967847,
"acc_norm_stderr": 0.02255244778047803
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.27469135802469136,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.27469135802469136,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.02678917235114025,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.02678917235114025
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279343,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279343
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2867647058823529,
"acc_stderr": 0.02747227447323382,
"acc_norm": 0.2867647058823529,
"acc_norm_stderr": 0.02747227447323382
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177788,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177788
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02892058322067558,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02892058322067558
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.31343283582089554,
"acc_stderr": 0.032801882053486414,
"acc_norm": 0.31343283582089554,
"acc_norm_stderr": 0.032801882053486414
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23976608187134502,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.23976608187134502,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.014679255032111066,
"mc2": 0.3834241864559234,
"mc2_stderr": 0.014237766301129044
}
}
```
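As a rough illustration of how these aggregated numbers can be consumed, here is a minimal sketch (assuming the JSON block above has been saved locally as `results.json`, a hypothetical filename) that averages the 5-shot MMLU ("hendrycksTest") accuracies:
```python
import json

# Minimal sketch: average the 5-shot MMLU ("hendrycksTest") accuracies
# from the aggregated results shown above. Assumes the JSON block was
# saved as results.json (hypothetical filename).
with open("results.json") as f:
    results = json.load(f)

mmlu_acc = {
    task: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}
mean_acc = sum(mmlu_acc.values()) / len(mmlu_acc)
print(f"{len(mmlu_acc)} MMLU subtasks, mean acc = {mean_acc:.4f}")
```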
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_OpenAssistant__llama2-70b-oasst-sft-v10 | 2023-08-27T12:44:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of OpenAssistant/llama2-70b-oasst-sft-v10
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenAssistant/llama2-70b-oasst-sft-v10](https://huggingface.co/OpenAssistant/llama2-70b-oasst-sft-v10)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenAssistant__llama2-70b-oasst-sft-v10\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-25T09:31:41.529472](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__llama2-70b-oasst-sft-v10/blob/main/results_2023-08-25T09%3A31%3A41.529472.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6762297160317442,\n\
\ \"acc_stderr\": 0.03170426091732023,\n \"acc_norm\": 0.6800806503392718,\n\
\ \"acc_norm_stderr\": 0.031677468446975464,\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.01705876150134797,\n \"mc2\": 0.5644519054208545,\n\
\ \"mc2_stderr\": 0.01489064456320429\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6382252559726962,\n \"acc_stderr\": 0.014041957945038076,\n\
\ \"acc_norm\": 0.6706484641638225,\n \"acc_norm_stderr\": 0.013734057652635474\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.668990240987851,\n\
\ \"acc_stderr\": 0.004696148339570979,\n \"acc_norm\": 0.8637721569408484,\n\
\ \"acc_norm_stderr\": 0.003423292881632149\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.033911609343436046,\n\
\ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.033911609343436046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544074,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544074\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6170212765957447,\n \"acc_stderr\": 0.031778212502369216,\n\
\ \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.031778212502369216\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.02555992055053101,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.02555992055053101\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8258064516129032,\n\
\ \"acc_stderr\": 0.021576248184514583,\n \"acc_norm\": 0.8258064516129032,\n\
\ \"acc_norm_stderr\": 0.021576248184514583\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528435,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528435\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.02482590979334333,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.02482590979334333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7128205128205128,\n \"acc_stderr\": 0.022939925418530613,\n\
\ \"acc_norm\": 0.7128205128205128,\n \"acc_norm_stderr\": 0.022939925418530613\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.02720537153827947,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.02720537153827947\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.04042809961395634,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.04042809961395634\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8715596330275229,\n \"acc_stderr\": 0.014344977542914318,\n \"\
acc_norm\": 0.8715596330275229,\n \"acc_norm_stderr\": 0.014344977542914318\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5925925925925926,\n \"acc_stderr\": 0.03350991604696043,\n \"\
acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.03350991604696043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8921568627450981,\n \"acc_stderr\": 0.021770522281368398,\n \"\
acc_norm\": 0.8921568627450981,\n \"acc_norm_stderr\": 0.021770522281368398\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \
\ \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n\
\ \"acc_stderr\": 0.028380391147094702,\n \"acc_norm\": 0.7668161434977578,\n\
\ \"acc_norm_stderr\": 0.028380391147094702\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540616,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540616\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822583,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822583\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8569604086845466,\n\
\ \"acc_stderr\": 0.012520023176796506,\n \"acc_norm\": 0.8569604086845466,\n\
\ \"acc_norm_stderr\": 0.012520023176796506\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4960893854748603,\n\
\ \"acc_stderr\": 0.016721990073156657,\n \"acc_norm\": 0.4960893854748603,\n\
\ \"acc_norm_stderr\": 0.016721990073156657\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n\
\ \"acc_stderr\": 0.024723861504771696,\n \"acc_norm\": 0.7459807073954984,\n\
\ \"acc_norm_stderr\": 0.024723861504771696\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5283687943262412,\n \"acc_stderr\": 0.029779450957303055,\n \
\ \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.029779450957303055\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5371577574967406,\n\
\ \"acc_stderr\": 0.012734923579532055,\n \"acc_norm\": 0.5371577574967406,\n\
\ \"acc_norm_stderr\": 0.012734923579532055\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7091503267973857,\n \"acc_stderr\": 0.018373116915903973,\n \
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.018373116915903973\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399683,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399683\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.021166216304659397,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.021166216304659397\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.01705876150134797,\n \"mc2\": 0.5644519054208545,\n\
\ \"mc2_stderr\": 0.01489064456320429\n }\n}\n```"
repo_url: https://huggingface.co/OpenAssistant/llama2-70b-oasst-sft-v10
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|arc:challenge|25_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|arc:challenge|25_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hellaswag|10_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hellaswag|10_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T02:39:07.386302.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T09:31:41.529472.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T09:31:41.529472.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T02_39_07.386302
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T02:39:07.386302.parquet'
- split: 2023_08_25T09_31_41.529472
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T09:31:41.529472.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T09:31:41.529472.parquet'
---
# Dataset Card for Evaluation run of OpenAssistant/llama2-70b-oasst-sft-v10
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenAssistant/llama2-70b-oasst-sft-v10
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenAssistant/llama2-70b-oasst-sft-v10](https://huggingface.co/OpenAssistant/llama2-70b-oasst-sft-v10) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenAssistant__llama2-70b-oasst-sft-v10",
"harness_truthfulqa_mc_0",
split="train")
```
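You can also point at a specific timestamped run, or at the aggregated "results" configuration mentioned above. A minimal sketch, assuming the split and configuration names listed in this card (using the `latest` split for the `results` configuration is an assumption):
```python
from datasets import load_dataset

# Details of one task for a specific evaluation run (split named by its timestamp)
run_details = load_dataset(
    "open-llm-leaderboard/details_OpenAssistant__llama2-70b-oasst-sft-v10",
    "harness_truthfulqa_mc_0",
    split="2023_08_25T09_31_41.529472",
)

# Aggregated metrics stored in the "results" configuration (split name assumed)
aggregated = load_dataset(
    "open-llm-leaderboard/details_OpenAssistant__llama2-70b-oasst-sft-v10",
    "results",
    split="latest",
)
print(aggregated[0])
```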
## Latest results
These are the [latest results from run 2023-08-25T09:31:41.529472](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__llama2-70b-oasst-sft-v10/blob/main/results_2023-08-25T09%3A31%3A41.529472.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6762297160317442,
"acc_stderr": 0.03170426091732023,
"acc_norm": 0.6800806503392718,
"acc_norm_stderr": 0.031677468446975464,
"mc1": 0.3880048959608323,
"mc1_stderr": 0.01705876150134797,
"mc2": 0.5644519054208545,
"mc2_stderr": 0.01489064456320429
},
"harness|arc:challenge|25": {
"acc": 0.6382252559726962,
"acc_stderr": 0.014041957945038076,
"acc_norm": 0.6706484641638225,
"acc_norm_stderr": 0.013734057652635474
},
"harness|hellaswag|10": {
"acc": 0.668990240987851,
"acc_stderr": 0.004696148339570979,
"acc_norm": 0.8637721569408484,
"acc_norm_stderr": 0.003423292881632149
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.033911609343436046,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.033911609343436046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544074,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544074
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.031778212502369216,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.031778212502369216
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.02555992055053101,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.02555992055053101
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514583,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514583
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.02845038880528435,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.02845038880528435
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.02482590979334333,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.02482590979334333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7128205128205128,
"acc_stderr": 0.022939925418530613,
"acc_norm": 0.7128205128205128,
"acc_norm_stderr": 0.022939925418530613
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.02720537153827947,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.02720537153827947
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8715596330275229,
"acc_stderr": 0.014344977542914318,
"acc_norm": 0.8715596330275229,
"acc_norm_stderr": 0.014344977542914318
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.03350991604696043,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.03350991604696043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8921568627450981,
"acc_stderr": 0.021770522281368398,
"acc_norm": 0.8921568627450981,
"acc_norm_stderr": 0.021770522281368398
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.028380391147094702,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.028380391147094702
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540616,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540616
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822583,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822583
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8569604086845466,
"acc_stderr": 0.012520023176796506,
"acc_norm": 0.8569604086845466,
"acc_norm_stderr": 0.012520023176796506
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4960893854748603,
"acc_stderr": 0.016721990073156657,
"acc_norm": 0.4960893854748603,
"acc_norm_stderr": 0.016721990073156657
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.024723861504771696,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.024723861504771696
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5371577574967406,
"acc_stderr": 0.012734923579532055,
"acc_norm": 0.5371577574967406,
"acc_norm_stderr": 0.012734923579532055
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.018373116915903973,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.018373116915903973
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399683,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399683
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659397,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659397
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3880048959608323,
"mc1_stderr": 0.01705876150134797,
"mc2": 0.5644519054208545,
"mc2_stderr": 0.01489064456320429
}
}
```
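If you prefer to work with the raw results file directly, the sketch below downloads it from the dataset repository. This is a minimal sketch, assuming the file keeps the name shown in the link above; the exact nesting of the JSON may differ from the excerpt, so the lookup is hedged accordingly.
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file for this run (filename taken from the link above)
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_OpenAssistant__llama2-70b-oasst-sft-v10",
    filename="results_2023-08-25T09:31:41.529472.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Aggregated metrics over all tasks; fall back to the top level if the file
# is not wrapped in a "results" key (the nesting is an assumption, see above).
all_metrics = results.get("results", results)["all"]
print(all_metrics)
```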
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4-peft | 2023-08-27T12:44:52.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jondurbin/airoboros-65b-gpt4-1.4-peft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-65b-gpt4-1.4-peft](https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.4-peft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4-peft\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T22:39:55.559103](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4-peft/blob/main/results_2023-08-18T22%3A39%3A55.559103.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6238829279110752,\n\
\ \"acc_stderr\": 0.033092611182654476,\n \"acc_norm\": 0.6273326952658104,\n\
\ \"acc_norm_stderr\": 0.03306899876956139,\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814038,\n \"mc2\": 0.5245475444019604,\n\
\ \"mc2_stderr\": 0.015335485182595166\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6339590443686007,\n \"acc_stderr\": 0.014077223108470137,\n\
\ \"acc_norm\": 0.6578498293515358,\n \"acc_norm_stderr\": 0.013864152159177278\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6786496713802032,\n\
\ \"acc_stderr\": 0.004660405565338759,\n \"acc_norm\": 0.8582951603266281,\n\
\ \"acc_norm_stderr\": 0.003480344142139514\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n\
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.03800968060554859,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.03800968060554859\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.038047497443647646,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.038047497443647646\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715563,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3544973544973545,\n \"acc_stderr\": 0.024636830602841997,\n \"\
acc_norm\": 0.3544973544973545,\n \"acc_norm_stderr\": 0.024636830602841997\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7161290322580646,\n \"acc_stderr\": 0.025649381063029254,\n \"\
acc_norm\": 0.7161290322580646,\n \"acc_norm_stderr\": 0.025649381063029254\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n \"\
acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878948,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878948\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849926,\n \"\
acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849926\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266857,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266857\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931055,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931055\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632443,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632443\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.0225090339370778,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.0225090339370778\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917202,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917202\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46368715083798884,\n\
\ \"acc_stderr\": 0.01667834189453317,\n \"acc_norm\": 0.46368715083798884,\n\
\ \"acc_norm_stderr\": 0.01667834189453317\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.026716118380156847,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.026716118380156847\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"\
acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47783572359843546,\n\
\ \"acc_stderr\": 0.01275768304771618,\n \"acc_norm\": 0.47783572359843546,\n\
\ \"acc_norm_stderr\": 0.01275768304771618\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.029896163033125474,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.029896163033125474\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.01887568293806945,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.01887568293806945\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.029504896454595964,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.029504896454595964\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814038,\n \"mc2\": 0.5245475444019604,\n\
\ \"mc2_stderr\": 0.015335485182595166\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.4-peft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|arc:challenge|25_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hellaswag|10_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:39:55.559103.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:39:55.559103.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T22_39_55.559103
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T22:39:55.559103.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T22:39:55.559103.parquet'
---
# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.4-peft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.4-peft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-65b-gpt4-1.4-peft](https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.4-peft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4-peft",
                    "harness_truthfulqa_mc_0",
                    split="train")
```
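Each configuration listed above also exposes a `latest` split pointing at the most recent parquet files; a minimal sketch of loading it for the same configuration (only the split name changes):
```python
from datasets import load_dataset

# Load the "latest" split instead of "train"; per the configs section above,
# each configuration also has a split named after the run timestamp.
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4-peft",
                    "harness_truthfulqa_mc_0",
                    split="latest")
print(data)
```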
## Latest results
These are the [latest results from run 2023-08-18T22:39:55.559103](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4-peft/blob/main/results_2023-08-18T22%3A39%3A55.559103.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6238829279110752,
"acc_stderr": 0.033092611182654476,
"acc_norm": 0.6273326952658104,
"acc_norm_stderr": 0.03306899876956139,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814038,
"mc2": 0.5245475444019604,
"mc2_stderr": 0.015335485182595166
},
"harness|arc:challenge|25": {
"acc": 0.6339590443686007,
"acc_stderr": 0.014077223108470137,
"acc_norm": 0.6578498293515358,
"acc_norm_stderr": 0.013864152159177278
},
"harness|hellaswag|10": {
"acc": 0.6786496713802032,
"acc_stderr": 0.004660405565338759,
"acc_norm": 0.8582951603266281,
"acc_norm_stderr": 0.003480344142139514
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03800968060554859,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03800968060554859
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.038047497443647646,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.038047497443647646
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006716,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006716
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715563,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3544973544973545,
"acc_stderr": 0.024636830602841997,
"acc_norm": 0.3544973544973545,
"acc_norm_stderr": 0.024636830602841997
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.025649381063029254,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.025649381063029254
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878948,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878948
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849926,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849926
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266857,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931055,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931055
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632443,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.0225090339370778,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.0225090339370778
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917202,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917202
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46368715083798884,
"acc_stderr": 0.01667834189453317,
"acc_norm": 0.46368715083798884,
"acc_norm_stderr": 0.01667834189453317
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.026716118380156847,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.026716118380156847
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7363344051446945,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.7363344051446945,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47783572359843546,
"acc_stderr": 0.01275768304771618,
"acc_norm": 0.47783572359843546,
"acc_norm_stderr": 0.01275768304771618
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.029896163033125474,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.029896163033125474
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.01887568293806945,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.01887568293806945
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.029504896454595964,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.029504896454595964
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814038,
"mc2": 0.5245475444019604,
"mc2_stderr": 0.015335485182595166
}
}
```
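To read these aggregate figures back programmatically, one option is to download the raw results file linked above; a short sketch (the filename is the URL-decoded name from that link, and the exact JSON layout inside the file is an assumption handled defensively below):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file for this run from the dataset repository
# (filename taken from the results link above, URL-decoded).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4-peft",
    filename="results_2023-08-18T22:39:55.559103.json",
    repo_type="dataset",
)

with open(path) as f:
    run = json.load(f)

# Assumption: the aggregate block shown above sits either at the top level or
# under a "results" key, depending on the file layout; handle both cases.
metrics = run.get("results", run)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
```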
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__WizardLM-30B-GPTQ | 2023-08-27T12:44:54.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/WizardLM-30B-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/WizardLM-30B-GPTQ](https://huggingface.co/TheBloke/WizardLM-30B-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__WizardLM-30B-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T13:58:45.500746](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-30B-GPTQ/blob/main/results_2023-08-22T13%3A58%3A45.500746.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24599177907991926,\n\
\ \"acc_stderr\": 0.031415523509032525,\n \"acc_norm\": 0.2471446509072939,\n\
\ \"acc_norm_stderr\": 0.03143344614230005,\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.4913727694395835,\n\
\ \"mc2_stderr\": 0.016939600973226592\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22525597269624573,\n \"acc_stderr\": 0.012207839995407326,\n\
\ \"acc_norm\": 0.2883959044368601,\n \"acc_norm_stderr\": 0.013238394422428173\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2559251145190201,\n\
\ \"acc_stderr\": 0.0043548810057897295,\n \"acc_norm\": 0.2608046205935073,\n\
\ \"acc_norm_stderr\": 0.004381761941552688\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174022,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174022\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.27631578947368424,\n \"acc_stderr\": 0.03639057569952924,\n\
\ \"acc_norm\": 0.27631578947368424,\n \"acc_norm_stderr\": 0.03639057569952924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.20754716981132076,\n \"acc_stderr\": 0.02495991802891127,\n\
\ \"acc_norm\": 0.20754716981132076,\n \"acc_norm_stderr\": 0.02495991802891127\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2847222222222222,\n\
\ \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.2847222222222222,\n\
\ \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.033687629322594295,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.033687629322594295\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.02865917937429232,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.02865917937429232\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489364,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489364\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.038783523721386215,\n\
\ \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.038783523721386215\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776564,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776564\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\
\ \"acc_stderr\": 0.03455071019102147,\n \"acc_norm\": 0.18253968253968253,\n\
\ \"acc_norm_stderr\": 0.03455071019102147\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22258064516129034,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.22258064516129034,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.17733990147783252,\n \"acc_stderr\": 0.026874337276808356,\n\
\ \"acc_norm\": 0.17733990147783252,\n \"acc_norm_stderr\": 0.026874337276808356\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365914,\n \"\
acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365914\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.15544041450777202,\n \"acc_stderr\": 0.026148483469153303,\n\
\ \"acc_norm\": 0.15544041450777202,\n \"acc_norm_stderr\": 0.026148483469153303\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20512820512820512,\n \"acc_stderr\": 0.020473233173552003,\n\
\ \"acc_norm\": 0.20512820512820512,\n \"acc_norm_stderr\": 0.020473233173552003\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22962962962962963,\n \"acc_stderr\": 0.025644108639267634,\n \
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.025644108639267634\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.20168067226890757,\n \"acc_stderr\": 0.02606431340630452,\n\
\ \"acc_norm\": 0.20168067226890757,\n \"acc_norm_stderr\": 0.02606431340630452\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473838,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473838\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22935779816513763,\n \"acc_stderr\": 0.018025349724618684,\n \"\
acc_norm\": 0.22935779816513763,\n \"acc_norm_stderr\": 0.018025349724618684\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2638888888888889,\n \"acc_stderr\": 0.03005820270430985,\n \"\
acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03005820270430985\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22549019607843138,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.28699551569506726,\n\
\ \"acc_stderr\": 0.030360379710291954,\n \"acc_norm\": 0.28699551569506726,\n\
\ \"acc_norm_stderr\": 0.030360379710291954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591206,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591206\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2692307692307692,\n\
\ \"acc_stderr\": 0.029058588303748842,\n \"acc_norm\": 0.2692307692307692,\n\
\ \"acc_norm_stderr\": 0.029058588303748842\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26181353767560667,\n\
\ \"acc_stderr\": 0.01572083867844526,\n \"acc_norm\": 0.26181353767560667,\n\
\ \"acc_norm_stderr\": 0.01572083867844526\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.023786203255508297,\n\
\ \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.023786203255508297\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2324022346368715,\n\
\ \"acc_stderr\": 0.014125968754673398,\n \"acc_norm\": 0.2324022346368715,\n\
\ \"acc_norm_stderr\": 0.014125968754673398\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.024170840879341016,\n\
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.024170840879341016\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24437299035369775,\n\
\ \"acc_stderr\": 0.024406162094668903,\n \"acc_norm\": 0.24437299035369775,\n\
\ \"acc_norm_stderr\": 0.024406162094668903\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22839506172839505,\n \"acc_stderr\": 0.023358211840626267,\n\
\ \"acc_norm\": 0.22839506172839505,\n \"acc_norm_stderr\": 0.023358211840626267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880585,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880585\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2666232073011734,\n\
\ \"acc_stderr\": 0.011293836031612142,\n \"acc_norm\": 0.2666232073011734,\n\
\ \"acc_norm_stderr\": 0.011293836031612142\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2426470588235294,\n \"acc_stderr\": 0.026040662474201264,\n\
\ \"acc_norm\": 0.2426470588235294,\n \"acc_norm_stderr\": 0.026040662474201264\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24183006535947713,\n \"acc_stderr\": 0.017322789207784326,\n \
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.017322789207784326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.04122066502878285,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.04122066502878285\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19183673469387755,\n \"acc_stderr\": 0.02520696315422538,\n\
\ \"acc_norm\": 0.19183673469387755,\n \"acc_norm_stderr\": 0.02520696315422538\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3034825870646766,\n\
\ \"acc_stderr\": 0.03251006816458618,\n \"acc_norm\": 0.3034825870646766,\n\
\ \"acc_norm_stderr\": 0.03251006816458618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n\
\ \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n\
\ \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.4913727694395835,\n\
\ \"mc2_stderr\": 0.016939600973226592\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/WizardLM-30B-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:58:45.500746.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:58:45.500746.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T13_58_45.500746
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:58:45.500746.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:58:45.500746.parquet'
---
# Dataset Card for Evaluation run of TheBloke/WizardLM-30B-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/WizardLM-30B-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/WizardLM-30B-GPTQ](https://huggingface.co/TheBloke/WizardLM-30B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration; the split is named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-30B-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
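Each per-task configuration listed in the YAML above also exposes a `latest` split that resolves to the files of the most recent run, so you can load a single sub-task instead of the whole evaluation. A minimal sketch, using the `harness_hendrycksTest_world_religions_5` configuration from this repository as an example:
```python
from datasets import load_dataset

# Load only the world_religions MMLU sub-task; the "latest" split
# points to the parquet files of the most recent evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_TheBloke__WizardLM-30B-GPTQ",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
```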
## Latest results
These are the [latest results from run 2023-08-22T13:58:45.500746](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-30B-GPTQ/blob/main/results_2023-08-22T13%3A58%3A45.500746.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24599177907991926,
"acc_stderr": 0.031415523509032525,
"acc_norm": 0.2471446509072939,
"acc_norm_stderr": 0.03143344614230005,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.4913727694395835,
"mc2_stderr": 0.016939600973226592
},
"harness|arc:challenge|25": {
"acc": 0.22525597269624573,
"acc_stderr": 0.012207839995407326,
"acc_norm": 0.2883959044368601,
"acc_norm_stderr": 0.013238394422428173
},
"harness|hellaswag|10": {
"acc": 0.2559251145190201,
"acc_stderr": 0.0043548810057897295,
"acc_norm": 0.2608046205935073,
"acc_norm_stderr": 0.004381761941552688
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174022,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174022
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.27631578947368424,
"acc_stderr": 0.03639057569952924,
"acc_norm": 0.27631578947368424,
"acc_norm_stderr": 0.03639057569952924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.20754716981132076,
"acc_stderr": 0.02495991802891127,
"acc_norm": 0.20754716981132076,
"acc_norm_stderr": 0.02495991802891127
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2847222222222222,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.2847222222222222,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.033687629322594295,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.033687629322594295
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.02865917937429232,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.02865917937429232
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489364,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489364
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.31724137931034485,
"acc_stderr": 0.038783523721386215,
"acc_norm": 0.31724137931034485,
"acc_norm_stderr": 0.038783523721386215
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776564,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.03455071019102147,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.03455071019102147
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22258064516129034,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.22258064516129034,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.17733990147783252,
"acc_stderr": 0.026874337276808356,
"acc_norm": 0.17733990147783252,
"acc_norm_stderr": 0.026874337276808356
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365914,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365914
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.15544041450777202,
"acc_stderr": 0.026148483469153303,
"acc_norm": 0.15544041450777202,
"acc_norm_stderr": 0.026148483469153303
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.020473233173552003,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.020473233173552003
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.025644108639267634,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.025644108639267634
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20168067226890757,
"acc_stderr": 0.02606431340630452,
"acc_norm": 0.20168067226890757,
"acc_norm_stderr": 0.02606431340630452
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473838,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473838
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22935779816513763,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.22935779816513763,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03005820270430985,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03005820270430985
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.28699551569506726,
"acc_stderr": 0.030360379710291954,
"acc_norm": 0.28699551569506726,
"acc_norm_stderr": 0.030360379710291954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591206,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591206
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.029058588303748842,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.029058588303748842
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26181353767560667,
"acc_stderr": 0.01572083867844526,
"acc_norm": 0.26181353767560667,
"acc_norm_stderr": 0.01572083867844526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.023786203255508297,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.023786203255508297
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2324022346368715,
"acc_stderr": 0.014125968754673398,
"acc_norm": 0.2324022346368715,
"acc_norm_stderr": 0.014125968754673398
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.024170840879341016,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.024170840879341016
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24437299035369775,
"acc_stderr": 0.024406162094668903,
"acc_norm": 0.24437299035369775,
"acc_norm_stderr": 0.024406162094668903
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22839506172839505,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.22839506172839505,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880585,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880585
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2666232073011734,
"acc_stderr": 0.011293836031612142,
"acc_norm": 0.2666232073011734,
"acc_norm_stderr": 0.011293836031612142
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2426470588235294,
"acc_stderr": 0.026040662474201264,
"acc_norm": 0.2426470588235294,
"acc_norm_stderr": 0.026040662474201264
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.04122066502878285,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.04122066502878285
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.02520696315422538,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.02520696315422538
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3034825870646766,
"acc_stderr": 0.03251006816458618,
"acc_norm": 0.3034825870646766,
"acc_norm_stderr": 0.03251006816458618
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.4913727694395835,
"mc2_stderr": 0.016939600973226592
}
}
```
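If you only need the aggregated numbers rather than the per-example details, the JSON file linked above can be downloaded directly from the repository. A minimal sketch, assuming the file keeps the filename shown in the link (the exact top-level layout of the file may differ slightly from the excerpt above):
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file for run 2023-08-22T13:58:45.500746
# from this dataset repository and inspect its top-level keys.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TheBloke__WizardLM-30B-GPTQ",
    filename="results_2023-08-22T13:58:45.500746.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(sorted(results))
```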
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__stable-vicuna-13B-HF | 2023-08-27T12:44:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/stable-vicuna-13B-HF
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/stable-vicuna-13B-HF](https://huggingface.co/TheBloke/stable-vicuna-13B-HF)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__stable-vicuna-13B-HF\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T17:12:46.134347](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__stable-vicuna-13B-HF/blob/main/results_2023-08-22T17%3A12%3A46.134347.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5045929860684458,\n\
\ \"acc_stderr\": 0.03513348936167607,\n \"acc_norm\": 0.5082332848277713,\n\
\ \"acc_norm_stderr\": 0.03511927358432602,\n \"mc1\": 0.3427172582619339,\n\
\ \"mc1_stderr\": 0.01661494938534704,\n \"mc2\": 0.4838353616511973,\n\
\ \"mc2_stderr\": 0.015030079987453928\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5170648464163823,\n \"acc_stderr\": 0.014602878388536597,\n\
\ \"acc_norm\": 0.5332764505119454,\n \"acc_norm_stderr\": 0.014578995859605806\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5864369647480582,\n\
\ \"acc_stderr\": 0.004914655063329499,\n \"acc_norm\": 0.7850029874526987,\n\
\ \"acc_norm_stderr\": 0.004099806728607399\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.46037735849056605,\n \"acc_stderr\": 0.030676096599389188,\n\
\ \"acc_norm\": 0.46037735849056605,\n \"acc_norm_stderr\": 0.030676096599389188\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923183,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923183\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3815028901734104,\n\
\ \"acc_stderr\": 0.037038511930995194,\n \"acc_norm\": 0.3815028901734104,\n\
\ \"acc_norm_stderr\": 0.037038511930995194\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963283,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963283\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5580645161290323,\n \"acc_stderr\": 0.028251557906849734,\n \"\
acc_norm\": 0.5580645161290323,\n \"acc_norm_stderr\": 0.028251557906849734\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.35467980295566504,\n \"acc_stderr\": 0.03366124489051449,\n \"\
acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.03366124489051449\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187896,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6262626262626263,\n \"acc_stderr\": 0.03446897738659333,\n \"\
acc_norm\": 0.6262626262626263,\n \"acc_norm_stderr\": 0.03446897738659333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.694300518134715,\n \"acc_stderr\": 0.033248379397581594,\n\
\ \"acc_norm\": 0.694300518134715,\n \"acc_norm_stderr\": 0.033248379397581594\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4512820512820513,\n \"acc_stderr\": 0.02523038123893484,\n \
\ \"acc_norm\": 0.4512820512820513,\n \"acc_norm_stderr\": 0.02523038123893484\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5042016806722689,\n \"acc_stderr\": 0.0324773433444811,\n\
\ \"acc_norm\": 0.5042016806722689,\n \"acc_norm_stderr\": 0.0324773433444811\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6642201834862386,\n\
\ \"acc_stderr\": 0.020248081396752923,\n \"acc_norm\": 0.6642201834862386,\n\
\ \"acc_norm_stderr\": 0.020248081396752923\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n\
\ \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6323529411764706,\n \"acc_stderr\": 0.03384132045674119,\n \"\
acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.03384132045674119\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6497890295358649,\n \"acc_stderr\": 0.031052391937584346,\n \
\ \"acc_norm\": 0.6497890295358649,\n \"acc_norm_stderr\": 0.031052391937584346\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978814,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978814\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.027236013946196687,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.027236013946196687\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6641123882503193,\n\
\ \"acc_stderr\": 0.016889407235171686,\n \"acc_norm\": 0.6641123882503193,\n\
\ \"acc_norm_stderr\": 0.016889407235171686\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.026756255129663762,\n\
\ \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.026756255129663762\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2748603351955307,\n\
\ \"acc_stderr\": 0.014931316703220504,\n \"acc_norm\": 0.2748603351955307,\n\
\ \"acc_norm_stderr\": 0.014931316703220504\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5261437908496732,\n \"acc_stderr\": 0.028590752958852387,\n\
\ \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.028590752958852387\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5434083601286174,\n\
\ \"acc_stderr\": 0.028290869054197604,\n \"acc_norm\": 0.5434083601286174,\n\
\ \"acc_norm_stderr\": 0.028290869054197604\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5216049382716049,\n \"acc_stderr\": 0.027794760105008736,\n\
\ \"acc_norm\": 0.5216049382716049,\n \"acc_norm_stderr\": 0.027794760105008736\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480619,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480619\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4048239895697523,\n\
\ \"acc_stderr\": 0.012536743830953987,\n \"acc_norm\": 0.4048239895697523,\n\
\ \"acc_norm_stderr\": 0.012536743830953987\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.030372836961539352,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.030372836961539352\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.5032679738562091,\n \"acc_stderr\": 0.020227402794434867,\n \"\
acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.020227402794434867\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.03113088039623593,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.03113088039623593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\
\ \"acc_stderr\": 0.033687874661154596,\n \"acc_norm\": 0.6517412935323383,\n\
\ \"acc_norm_stderr\": 0.033687874661154596\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3427172582619339,\n\
\ \"mc1_stderr\": 0.01661494938534704,\n \"mc2\": 0.4838353616511973,\n\
\ \"mc2_stderr\": 0.015030079987453928\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/stable-vicuna-13B-HF
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|arc:challenge|25_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hellaswag|10_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:12:46.134347.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:12:46.134347.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T17_12_46.134347
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T17:12:46.134347.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T17:12:46.134347.parquet'
---
# Dataset Card for Evaluation run of TheBloke/stable-vicuna-13B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/stable-vicuna-13B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/stable-vicuna-13B-HF](https://huggingface.co/TheBloke/stable-vicuna-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__stable-vicuna-13B-HF",
"harness_truthfulqa_mc_0",
split="train")
```
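The same call pattern works for any configuration listed in the YAML header above. As a minimal sketch (assuming the aggregated "results" configuration follows the same split-naming convention as the per-task configurations), the latest aggregated metrics can be loaded like this:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split always points
# to the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__stable-vicuna-13B-HF",
    "results",
    split="latest",
)
```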
## Latest results
These are the [latest results from run 2023-08-22T17:12:46.134347](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__stable-vicuna-13B-HF/blob/main/results_2023-08-22T17%3A12%3A46.134347.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5045929860684458,
"acc_stderr": 0.03513348936167607,
"acc_norm": 0.5082332848277713,
"acc_norm_stderr": 0.03511927358432602,
"mc1": 0.3427172582619339,
"mc1_stderr": 0.01661494938534704,
"mc2": 0.4838353616511973,
"mc2_stderr": 0.015030079987453928
},
"harness|arc:challenge|25": {
"acc": 0.5170648464163823,
"acc_stderr": 0.014602878388536597,
"acc_norm": 0.5332764505119454,
"acc_norm_stderr": 0.014578995859605806
},
"harness|hellaswag|10": {
"acc": 0.5864369647480582,
"acc_stderr": 0.004914655063329499,
"acc_norm": 0.7850029874526987,
"acc_norm_stderr": 0.004099806728607399
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.46037735849056605,
"acc_stderr": 0.030676096599389188,
"acc_norm": 0.46037735849056605,
"acc_norm_stderr": 0.030676096599389188
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923183,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923183
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3815028901734104,
"acc_stderr": 0.037038511930995194,
"acc_norm": 0.3815028901734104,
"acc_norm_stderr": 0.037038511930995194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.023517294335963283,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.023517294335963283
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5580645161290323,
"acc_stderr": 0.028251557906849734,
"acc_norm": 0.5580645161290323,
"acc_norm_stderr": 0.028251557906849734
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.03366124489051449,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.03366124489051449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187896,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6262626262626263,
"acc_stderr": 0.03446897738659333,
"acc_norm": 0.6262626262626263,
"acc_norm_stderr": 0.03446897738659333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.694300518134715,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.694300518134715,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4512820512820513,
"acc_stderr": 0.02523038123893484,
"acc_norm": 0.4512820512820513,
"acc_norm_stderr": 0.02523038123893484
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5042016806722689,
"acc_stderr": 0.0324773433444811,
"acc_norm": 0.5042016806722689,
"acc_norm_stderr": 0.0324773433444811
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6642201834862386,
"acc_stderr": 0.020248081396752923,
"acc_norm": 0.6642201834862386,
"acc_norm_stderr": 0.020248081396752923
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.03384132045674119,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.03384132045674119
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6497890295358649,
"acc_stderr": 0.031052391937584346,
"acc_norm": 0.6497890295358649,
"acc_norm_stderr": 0.031052391937584346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978814,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978814
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196687,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196687
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6641123882503193,
"acc_stderr": 0.016889407235171686,
"acc_norm": 0.6641123882503193,
"acc_norm_stderr": 0.016889407235171686
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.026756255129663762,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.026756255129663762
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2748603351955307,
"acc_stderr": 0.014931316703220504,
"acc_norm": 0.2748603351955307,
"acc_norm_stderr": 0.014931316703220504
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5261437908496732,
"acc_stderr": 0.028590752958852387,
"acc_norm": 0.5261437908496732,
"acc_norm_stderr": 0.028590752958852387
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5434083601286174,
"acc_stderr": 0.028290869054197604,
"acc_norm": 0.5434083601286174,
"acc_norm_stderr": 0.028290869054197604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5216049382716049,
"acc_stderr": 0.027794760105008736,
"acc_norm": 0.5216049382716049,
"acc_norm_stderr": 0.027794760105008736
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480619,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480619
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4048239895697523,
"acc_stderr": 0.012536743830953987,
"acc_norm": 0.4048239895697523,
"acc_norm_stderr": 0.012536743830953987
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5,
"acc_stderr": 0.030372836961539352,
"acc_norm": 0.5,
"acc_norm_stderr": 0.030372836961539352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.020227402794434867,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.020227402794434867
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.03113088039623593,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.03113088039623593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.033687874661154596,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.033687874661154596
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3427172582619339,
"mc1_stderr": 0.01661494938534704,
"mc2": 0.4838353616511973,
"mc2_stderr": 0.015030079987453928
}
}
```
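For quick inspection outside of `datasets`, the raw results file linked above can also be downloaded directly. A small sketch follows; the nesting of the metrics dictionary is an assumption, so check the file layout before indexing into it:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw JSON for the run shown above (filename taken from the link in this card).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TheBloke__stable-vicuna-13B-HF",
    filename="results_2023-08-22T17:12:46.134347.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

# Some dumps nest the metrics under a top-level "results" key; otherwise fall
# back to the flat layout shown in this card.
metrics = raw.get("results", raw)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
```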
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__Platypus-30B-SuperHOT-8K-fp16 | 2023-08-27T12:44:58.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/Platypus-30B-SuperHOT-8K-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Platypus-30B-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/Platypus-30B-SuperHOT-8K-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Platypus-30B-SuperHOT-8K-fp16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T16:25:34.320244](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Platypus-30B-SuperHOT-8K-fp16/blob/main/results_2023-08-18T16%3A25%3A34.320244.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23647488823331855,\n\
\ \"acc_stderr\": 0.030908567573023033,\n \"acc_norm\": 0.23771978116158754,\n\
\ \"acc_norm_stderr\": 0.030923042741200276,\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.014450846714123892,\n \"mc2\": 0.471292004765754,\n\
\ \"mc2_stderr\": 0.01664156844910162\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.21843003412969283,\n \"acc_stderr\": 0.012074291605700987,\n\
\ \"acc_norm\": 0.2568259385665529,\n \"acc_norm_stderr\": 0.0127669237941168\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2731527584146584,\n\
\ \"acc_stderr\": 0.004446680081493746,\n \"acc_norm\": 0.3082055367456682,\n\
\ \"acc_norm_stderr\": 0.004608082815535489\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n\
\ \"acc_stderr\": 0.035025531706783186,\n \"acc_norm\": 0.2074074074074074,\n\
\ \"acc_norm_stderr\": 0.035025531706783186\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.032790004063100515,\n\
\ \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.032790004063100515\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.025757559893106748,\n\
\ \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.025757559893106748\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.02865917937429232,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.02865917937429232\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.040061680838488746,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.040061680838488746\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.25483870967741934,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.18226600985221675,\n \"acc_stderr\": 0.02716334085964515,\n\
\ \"acc_norm\": 0.18226600985221675,\n \"acc_norm_stderr\": 0.02716334085964515\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.18686868686868688,\n \"acc_stderr\": 0.02777253333421898,\n \"\
acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.02777253333421898\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.02951928261681723,\n\
\ \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.02951928261681723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.021444547301560476,\n\
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.021444547301560476\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2037037037037037,\n \"acc_stderr\": 0.024556172219141265,\n \
\ \"acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.024556172219141265\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.026653531596715494,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.026653531596715494\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.17218543046357615,\n \"acc_stderr\": 0.030826136961962396,\n \"\
acc_norm\": 0.17218543046357615,\n \"acc_norm_stderr\": 0.030826136961962396\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1981651376146789,\n \"acc_stderr\": 0.017090573804217885,\n \"\
acc_norm\": 0.1981651376146789,\n \"acc_norm_stderr\": 0.017090573804217885\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2037037037037037,\n \"acc_stderr\": 0.02746740180405799,\n \"\
acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.02746740180405799\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.02931281415395592,\n\
\ \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.02931281415395592\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3094170403587444,\n\
\ \"acc_stderr\": 0.031024411740572203,\n \"acc_norm\": 0.3094170403587444,\n\
\ \"acc_norm_stderr\": 0.031024411740572203\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2515964240102171,\n\
\ \"acc_stderr\": 0.015517322365529619,\n \"acc_norm\": 0.2515964240102171,\n\
\ \"acc_norm_stderr\": 0.015517322365529619\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912255,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912255\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2347266881028939,\n\
\ \"acc_stderr\": 0.024071805887677048,\n \"acc_norm\": 0.2347266881028939,\n\
\ \"acc_norm_stderr\": 0.024071805887677048\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.02289916291844581,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.02289916291844581\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872405,\n \
\ \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24967405475880053,\n\
\ \"acc_stderr\": 0.011054538377832318,\n \"acc_norm\": 0.24967405475880053,\n\
\ \"acc_norm_stderr\": 0.011054538377832318\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1948529411764706,\n \"acc_stderr\": 0.024060599423487428,\n\
\ \"acc_norm\": 0.1948529411764706,\n \"acc_norm_stderr\": 0.024060599423487428\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \
\ \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n\
\ \"acc_stderr\": 0.034843315926805875,\n \"acc_norm\": 0.27710843373493976,\n\
\ \"acc_norm_stderr\": 0.034843315926805875\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2573099415204678,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.2573099415204678,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.014450846714123892,\n \"mc2\": 0.471292004765754,\n\
\ \"mc2_stderr\": 0.01664156844910162\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Platypus-30B-SuperHOT-8K-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|arc:challenge|25_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hellaswag|10_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T16:25:34.320244.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T16:25:34.320244.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T16_25_34.320244
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T16:25:34.320244.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T16:25:34.320244.parquet'
---
# Dataset Card for Evaluation run of TheBloke/Platypus-30B-SuperHOT-8K-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Platypus-30B-SuperHOT-8K-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Platypus-30B-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/Platypus-30B-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Platypus-30B-SuperHOT-8K-fp16",
"harness_truthfulqa_mc_0",
split="train")
```
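Each timestamped run is also exposed as its own split, and the most recent one is aliased as `latest`. A minimal sketch, using split and configuration names declared in this card's metadata (the timestamped split below corresponds to the single run recorded here):
```python
from datasets import load_dataset

# Load the most recent evaluation details for one task configuration.
latest = load_dataset(
    "open-llm-leaderboard/details_TheBloke__Platypus-30B-SuperHOT-8K-fp16",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)

# Or pin a specific run by its timestamped split name.
run = load_dataset(
    "open-llm-leaderboard/details_TheBloke__Platypus-30B-SuperHOT-8K-fp16",
    "harness_hendrycksTest_abstract_algebra_5",
    split="2023_08_18T16_25_34.320244",
)
```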
## Latest results
These are the [latest results from run 2023-08-18T16:25:34.320244](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Platypus-30B-SuperHOT-8K-fp16/blob/main/results_2023-08-18T16%3A25%3A34.320244.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23647488823331855,
"acc_stderr": 0.030908567573023033,
"acc_norm": 0.23771978116158754,
"acc_norm_stderr": 0.030923042741200276,
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123892,
"mc2": 0.471292004765754,
"mc2_stderr": 0.01664156844910162
},
"harness|arc:challenge|25": {
"acc": 0.21843003412969283,
"acc_stderr": 0.012074291605700987,
"acc_norm": 0.2568259385665529,
"acc_norm_stderr": 0.0127669237941168
},
"harness|hellaswag|10": {
"acc": 0.2731527584146584,
"acc_stderr": 0.004446680081493746,
"acc_norm": 0.3082055367456682,
"acc_norm_stderr": 0.004608082815535489
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.035025531706783186,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.035025531706783186
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.032790004063100515,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.032790004063100515
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22641509433962265,
"acc_stderr": 0.025757559893106748,
"acc_norm": 0.22641509433962265,
"acc_norm_stderr": 0.025757559893106748
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.02865917937429232,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.02865917937429232
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488746,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488746
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18226600985221675,
"acc_stderr": 0.02716334085964515,
"acc_norm": 0.18226600985221675,
"acc_norm_stderr": 0.02716334085964515
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18686868686868688,
"acc_stderr": 0.02777253333421898,
"acc_norm": 0.18686868686868688,
"acc_norm_stderr": 0.02777253333421898
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.02951928261681723,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.02951928261681723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.021444547301560476,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.021444547301560476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.024556172219141265,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.024556172219141265
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.026653531596715494,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.026653531596715494
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.17218543046357615,
"acc_stderr": 0.030826136961962396,
"acc_norm": 0.17218543046357615,
"acc_norm_stderr": 0.030826136961962396
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1981651376146789,
"acc_stderr": 0.017090573804217885,
"acc_norm": 0.1981651376146789,
"acc_norm_stderr": 0.017090573804217885
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.02746740180405799,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.02746740180405799
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.02931281415395592,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.02931281415395592
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3094170403587444,
"acc_stderr": 0.031024411740572203,
"acc_norm": 0.3094170403587444,
"acc_norm_stderr": 0.031024411740572203
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2515964240102171,
"acc_stderr": 0.015517322365529619,
"acc_norm": 0.2515964240102171,
"acc_norm_stderr": 0.015517322365529619
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912255,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912255
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2347266881028939,
"acc_stderr": 0.024071805887677048,
"acc_norm": 0.2347266881028939,
"acc_norm_stderr": 0.024071805887677048
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.02289916291844581,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.02289916291844581
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872405,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24967405475880053,
"acc_stderr": 0.011054538377832318,
"acc_norm": 0.24967405475880053,
"acc_norm_stderr": 0.011054538377832318
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1948529411764706,
"acc_stderr": 0.024060599423487428,
"acc_norm": 0.1948529411764706,
"acc_norm_stderr": 0.024060599423487428
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.034843315926805875,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.034843315926805875
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2573099415204678,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.2573099415204678,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123892,
"mc2": 0.471292004765754,
"mc2_stderr": 0.01664156844910162
}
}
```
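If you want to roll up these per-task numbers yourself, a minimal sketch (assuming the JSON above has already been parsed into a Python dict named `results`, e.g. via `json.load`) is:
```python
# Average the 5-shot MMLU ("hendrycksTest") accuracies reported above.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mmlu_avg = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"MMLU tasks: {len(mmlu)}, average acc: {mmlu_avg:.4f}")
```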
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__wizard-mega-13B-GPTQ | 2023-08-27T12:45:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/wizard-mega-13B-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/wizard-mega-13B-GPTQ](https://huggingface.co/TheBloke/wizard-mega-13B-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__wizard-mega-13B-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T10:09:24.633261](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__wizard-mega-13B-GPTQ/blob/main/results_2023-08-22T10%3A09%3A24.633261.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24921936031109732,\n\
\ \"acc_stderr\": 0.031469310713380494,\n \"acc_norm\": 0.25035681128103693,\n\
\ \"acc_norm_stderr\": 0.03148779504732221,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662574,\n \"mc2\": 0.4869109173912817,\n\
\ \"mc2_stderr\": 0.01702324741696185\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2158703071672355,\n \"acc_stderr\": 0.012022975360030684,\n\
\ \"acc_norm\": 0.2773037542662116,\n \"acc_norm_stderr\": 0.013082095839059374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2544313881696873,\n\
\ \"acc_stderr\": 0.004346509850679535,\n \"acc_norm\": 0.26010754829715194,\n\
\ \"acc_norm_stderr\": 0.004377965074211625\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17777777777777778,\n\
\ \"acc_stderr\": 0.03302789859901717,\n \"acc_norm\": 0.17777777777777778,\n\
\ \"acc_norm_stderr\": 0.03302789859901717\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.034597776068105345,\n\
\ \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.034597776068105345\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.19,\n\
\ \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.19,\n \
\ \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2490566037735849,\n \"acc_stderr\": 0.026616482980501704,\n\
\ \"acc_norm\": 0.2490566037735849,\n \"acc_norm_stderr\": 0.026616482980501704\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.034370793441061344,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.034370793441061344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.029513196625539355,\n\
\ \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.029513196625539355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.03878352372138623,\n\
\ \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.03878352372138623\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23544973544973544,\n \"acc_stderr\": 0.021851509822031715,\n \"\
acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.021851509822031715\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604672,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604672\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22903225806451613,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.22903225806451613,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293752,\n\
\ \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293752\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2878787878787879,\n \"acc_stderr\": 0.03225883512300993,\n \"\
acc_norm\": 0.2878787878787879,\n \"acc_norm_stderr\": 0.03225883512300993\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2694300518134715,\n \"acc_stderr\": 0.03201867122877794,\n\
\ \"acc_norm\": 0.2694300518134715,\n \"acc_norm_stderr\": 0.03201867122877794\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.022421273612923703,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.022421273612923703\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766135,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766135\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2689075630252101,\n \"acc_stderr\": 0.028801392193631273,\n\
\ \"acc_norm\": 0.2689075630252101,\n \"acc_norm_stderr\": 0.028801392193631273\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24220183486238533,\n \"acc_stderr\": 0.01836817630659862,\n \"\
acc_norm\": 0.24220183486238533,\n \"acc_norm_stderr\": 0.01836817630659862\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828979,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828979\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22058823529411764,\n \"acc_stderr\": 0.029102254389674082,\n \"\
acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.029102254389674082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.21940928270042195,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.21940928270042195,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.22869955156950672,\n\
\ \"acc_stderr\": 0.028188240046929196,\n \"acc_norm\": 0.22869955156950672,\n\
\ \"acc_norm_stderr\": 0.028188240046929196\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.21487603305785125,\n \"acc_stderr\": 0.03749492448709698,\n \"\
acc_norm\": 0.21487603305785125,\n \"acc_norm_stderr\": 0.03749492448709698\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.03322015795776741,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.03322015795776741\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.045821241601615506,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.045821241601615506\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n\
\ \"acc_stderr\": 0.02844796547623101,\n \"acc_norm\": 0.25213675213675213,\n\
\ \"acc_norm_stderr\": 0.02844796547623101\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.22860791826309068,\n\
\ \"acc_stderr\": 0.015016884698539894,\n \"acc_norm\": 0.22860791826309068,\n\
\ \"acc_norm_stderr\": 0.015016884698539894\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26256983240223464,\n\
\ \"acc_stderr\": 0.014716824273017744,\n \"acc_norm\": 0.26256983240223464,\n\
\ \"acc_norm_stderr\": 0.014716824273017744\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21864951768488747,\n\
\ \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.21864951768488747,\n\
\ \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.025171041915309684,\n\
\ \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.025171041915309684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090202,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090202\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26597131681877445,\n\
\ \"acc_stderr\": 0.011285033165551265,\n \"acc_norm\": 0.26597131681877445,\n\
\ \"acc_norm_stderr\": 0.011285033165551265\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.23161764705882354,\n \"acc_stderr\": 0.025626533803777562,\n\
\ \"acc_norm\": 0.23161764705882354,\n \"acc_norm_stderr\": 0.025626533803777562\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26633986928104575,\n \"acc_stderr\": 0.017883188134667206,\n \
\ \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.017883188134667206\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2979591836734694,\n \"acc_stderr\": 0.029279567411065664,\n\
\ \"acc_norm\": 0.2979591836734694,\n \"acc_norm_stderr\": 0.029279567411065664\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n\
\ \"acc_stderr\": 0.03152439186555404,\n \"acc_norm\": 0.2736318407960199,\n\
\ \"acc_norm_stderr\": 0.03152439186555404\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2289156626506024,\n\
\ \"acc_stderr\": 0.03270745277352477,\n \"acc_norm\": 0.2289156626506024,\n\
\ \"acc_norm_stderr\": 0.03270745277352477\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662574,\n \"mc2\": 0.4869109173912817,\n\
\ \"mc2_stderr\": 0.01702324741696185\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/wizard-mega-13B-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|arc:challenge|25_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hellaswag|10_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T10:09:24.633261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T10:09:24.633261.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T10_09_24.633261
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T10:09:24.633261.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T10:09:24.633261.parquet'
---
# Dataset Card for Evaluation run of TheBloke/wizard-mega-13B-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/wizard-mega-13B-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/wizard-mega-13B-GPTQ](https://huggingface.co/TheBloke/wizard-mega-13B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__wizard-mega-13B-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
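As a minimal sketch (not part of the generated card), the same helper can target any of the per-task configurations listed in the YAML above; this assumes the "latest" split alias shown in those configurations is available for the configuration you pick:
```python
from datasets import load_dataset

# Sketch: load the most recent per-example details for one MMLU subtask.
# The config name below is taken from the `configs` list in this card's YAML,
# and the "latest" split is assumed to point at the newest evaluation run.
details = load_dataset(
    "open-llm-leaderboard/details_TheBloke__wizard-mega-13B-GPTQ",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(details)  # inspect the per-example records for this subtask
```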
## Latest results
These are the [latest results from run 2023-08-22T10:09:24.633261](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__wizard-mega-13B-GPTQ/blob/main/results_2023-08-22T10%3A09%3A24.633261.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24921936031109732,
"acc_stderr": 0.031469310713380494,
"acc_norm": 0.25035681128103693,
"acc_norm_stderr": 0.03148779504732221,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662574,
"mc2": 0.4869109173912817,
"mc2_stderr": 0.01702324741696185
},
"harness|arc:challenge|25": {
"acc": 0.2158703071672355,
"acc_stderr": 0.012022975360030684,
"acc_norm": 0.2773037542662116,
"acc_norm_stderr": 0.013082095839059374
},
"harness|hellaswag|10": {
"acc": 0.2544313881696873,
"acc_stderr": 0.004346509850679535,
"acc_norm": 0.26010754829715194,
"acc_norm_stderr": 0.004377965074211625
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17777777777777778,
"acc_stderr": 0.03302789859901717,
"acc_norm": 0.17777777777777778,
"acc_norm_stderr": 0.03302789859901717
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.034597776068105345,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.034597776068105345
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2490566037735849,
"acc_stderr": 0.026616482980501704,
"acc_norm": 0.2490566037735849,
"acc_norm_stderr": 0.026616482980501704
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.034370793441061344,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.034370793441061344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.029513196625539355,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.029513196625539355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.31724137931034485,
"acc_stderr": 0.03878352372138623,
"acc_norm": 0.31724137931034485,
"acc_norm_stderr": 0.03878352372138623
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.021851509822031715,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.021851509822031715
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604672,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604672
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22903225806451613,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.22903225806451613,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293752,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293752
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2878787878787879,
"acc_stderr": 0.03225883512300993,
"acc_norm": 0.2878787878787879,
"acc_norm_stderr": 0.03225883512300993
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2694300518134715,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.2694300518134715,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.022421273612923703,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.022421273612923703
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766135,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766135
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2689075630252101,
"acc_stderr": 0.028801392193631273,
"acc_norm": 0.2689075630252101,
"acc_norm_stderr": 0.028801392193631273
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24220183486238533,
"acc_stderr": 0.01836817630659862,
"acc_norm": 0.24220183486238533,
"acc_norm_stderr": 0.01836817630659862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828979,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828979
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.21940928270042195,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.21940928270042195,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.22869955156950672,
"acc_stderr": 0.028188240046929196,
"acc_norm": 0.22869955156950672,
"acc_norm_stderr": 0.028188240046929196
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.21487603305785125,
"acc_stderr": 0.03749492448709698,
"acc_norm": 0.21487603305785125,
"acc_norm_stderr": 0.03749492448709698
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.25213675213675213,
"acc_stderr": 0.02844796547623101,
"acc_norm": 0.25213675213675213,
"acc_norm_stderr": 0.02844796547623101
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.22860791826309068,
"acc_stderr": 0.015016884698539894,
"acc_norm": 0.22860791826309068,
"acc_norm_stderr": 0.015016884698539894
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26256983240223464,
"acc_stderr": 0.014716824273017744,
"acc_norm": 0.26256983240223464,
"acc_norm_stderr": 0.014716824273017744
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21864951768488747,
"acc_stderr": 0.02347558141786111,
"acc_norm": 0.21864951768488747,
"acc_norm_stderr": 0.02347558141786111
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.025171041915309684,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.025171041915309684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090202,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090202
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26597131681877445,
"acc_stderr": 0.011285033165551265,
"acc_norm": 0.26597131681877445,
"acc_norm_stderr": 0.011285033165551265
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23161764705882354,
"acc_stderr": 0.025626533803777562,
"acc_norm": 0.23161764705882354,
"acc_norm_stderr": 0.025626533803777562
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.017883188134667206,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.017883188134667206
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2979591836734694,
"acc_stderr": 0.029279567411065664,
"acc_norm": 0.2979591836734694,
"acc_norm_stderr": 0.029279567411065664
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.03152439186555404,
"acc_norm": 0.2736318407960199,
"acc_norm_stderr": 0.03152439186555404
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2289156626506024,
"acc_stderr": 0.03270745277352477,
"acc_norm": 0.2289156626506024,
"acc_norm_stderr": 0.03270745277352477
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662574,
"mc2": 0.4869109173912817,
"mc2_stderr": 0.01702324741696185
}
}
```
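As a hedged sketch, the aggregated JSON shown above can also be fetched directly from the repository with the standard `huggingface_hub` download helper; the filename is assumed from the "latest results" link above, and the exact top-level layout of the file may differ from the excerpt (for instance, the metrics may be nested under a "results" key):
```python
import json
from huggingface_hub import hf_hub_download

# Sketch: download the aggregated results file linked above and inspect it.
# The filename is assumed from the "latest results" link in this card.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TheBloke__wizard-mega-13B-GPTQ",
    filename="results_2023-08-22T10:09:24.633261.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# The layout may differ from the excerpt above, so list the top-level keys first.
print(list(results.keys()))
```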
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__Project-Baize-v2-13B-GPTQ | 2023-08-27T12:45:02.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/Project-Baize-v2-13B-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Project-Baize-v2-13B-GPTQ](https://huggingface.co/TheBloke/Project-Baize-v2-13B-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Project-Baize-v2-13B-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T13:47:48.408564](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Project-Baize-v2-13B-GPTQ/blob/main/results_2023-08-22T13%3A47%3A48.408564.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2587648736963,\n\
\ \"acc_stderr\": 0.03187184494961934,\n \"acc_norm\": 0.2594229387695995,\n\
\ \"acc_norm_stderr\": 0.03188074586823278,\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752339,\n \"mc2\": 0.48217112656241606,\n\
\ \"mc2_stderr\": 0.01706048623340291\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.24573378839590443,\n \"acc_stderr\": 0.01258103345373011,\n\
\ \"acc_norm\": 0.27559726962457337,\n \"acc_norm_stderr\": 0.013057169655761838\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2552280422226648,\n\
\ \"acc_stderr\": 0.004350982826580599,\n \"acc_norm\": 0.26419040031866164,\n\
\ \"acc_norm_stderr\": 0.004400000822742062\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.20754716981132076,\n \"acc_stderr\": 0.02495991802891127,\n\
\ \"acc_norm\": 0.20754716981132076,\n \"acc_norm_stderr\": 0.02495991802891127\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641143,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641143\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.02924188386962881,\n\
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.02924188386962881\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.0220190800122179,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.0220190800122179\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.038095238095238106,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.038095238095238106\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.23225806451612904,\n \"acc_stderr\": 0.02402225613030824,\n \"\
acc_norm\": 0.23225806451612904,\n \"acc_norm_stderr\": 0.02402225613030824\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.03178529710642751,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.03178529710642751\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25757575757575757,\n \"acc_stderr\": 0.03115626951964686,\n \"\
acc_norm\": 0.25757575757575757,\n \"acc_norm_stderr\": 0.03115626951964686\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20207253886010362,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.20207253886010362,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.02075242372212802,\n \
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.02075242372212802\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.026653531596715484,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.026653531596715484\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22568807339449543,\n \"acc_stderr\": 0.017923087667803057,\n \"\
acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.017923087667803057\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2916666666666667,\n \"acc_stderr\": 0.03099866630456053,\n \"\
acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.03099866630456053\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.030964517926923393,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.030964517926923393\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.21940928270042195,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.21940928270042195,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2600896860986547,\n\
\ \"acc_stderr\": 0.029442495585857473,\n \"acc_norm\": 0.2600896860986547,\n\
\ \"acc_norm_stderr\": 0.029442495585857473\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.16030534351145037,\n \"acc_stderr\": 0.0321782942074463,\n\
\ \"acc_norm\": 0.16030534351145037,\n \"acc_norm_stderr\": 0.0321782942074463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.034624199316156234,\n\
\ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.034624199316156234\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n\
\ \"acc_stderr\": 0.038342410214190714,\n \"acc_norm\": 0.20535714285714285,\n\
\ \"acc_norm_stderr\": 0.038342410214190714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24786324786324787,\n\
\ \"acc_stderr\": 0.0282863240755644,\n \"acc_norm\": 0.24786324786324787,\n\
\ \"acc_norm_stderr\": 0.0282863240755644\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2720306513409962,\n\
\ \"acc_stderr\": 0.015913367447500524,\n \"acc_norm\": 0.2720306513409962,\n\
\ \"acc_norm_stderr\": 0.015913367447500524\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2630057803468208,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.2630057803468208,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261446,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261446\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02564686309713791,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02564686309713791\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n\
\ \"acc_stderr\": 0.025670259242188947,\n \"acc_norm\": 0.2861736334405145,\n\
\ \"acc_norm_stderr\": 0.025670259242188947\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.31790123456790126,\n \"acc_stderr\": 0.02591006352824087,\n\
\ \"acc_norm\": 0.31790123456790126,\n \"acc_norm_stderr\": 0.02591006352824087\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.026891709428343957,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.026891709428343957\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26140808344198174,\n\
\ \"acc_stderr\": 0.01122252816977131,\n \"acc_norm\": 0.26140808344198174,\n\
\ \"acc_norm_stderr\": 0.01122252816977131\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.30514705882352944,\n \"acc_stderr\": 0.0279715413701706,\n\
\ \"acc_norm\": 0.30514705882352944,\n \"acc_norm_stderr\": 0.0279715413701706\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2647058823529412,\n \"acc_stderr\": 0.017848089574913226,\n \
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.017848089574913226\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.029923100563683903,\n\
\ \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.029923100563683903\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2835820895522388,\n\
\ \"acc_stderr\": 0.031871875379197966,\n \"acc_norm\": 0.2835820895522388,\n\
\ \"acc_norm_stderr\": 0.031871875379197966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.034605799075530255,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.034605799075530255\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.03158149539338733,\n\
\ \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.03158149539338733\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752339,\n \"mc2\": 0.48217112656241606,\n\
\ \"mc2_stderr\": 0.01706048623340291\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Project-Baize-v2-13B-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:47:48.408564.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:47:48.408564.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T13_47_48.408564
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:47:48.408564.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:47:48.408564.parquet'
---
# Dataset Card for Evaluation run of TheBloke/Project-Baize-v2-13B-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Project-Baize-v2-13B-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Project-Baize-v2-13B-GPTQ](https://huggingface.co/TheBloke/Project-Baize-v2-13B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Project-Baize-v2-13B-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
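Each per-task configuration listed in the metadata above can be loaded the same way. As a minimal sketch (using the `harness_hendrycksTest_abstract_algebra_5` configuration as an example; the `latest` split mirrors the most recent run, while the timestamped split pins a specific one):
```python
from datasets import load_dataset

# Details for a single MMLU subtask; "latest" tracks the most recent evaluation
# run, while the timestamped split (e.g. "2023_08_22T13_47_48.408564") pins that run.
abstract_algebra = load_dataset(
    "open-llm-leaderboard/details_TheBloke__Project-Baize-v2-13B-GPTQ",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(abstract_algebra)
```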
## Latest results
These are the [latest results from run 2023-08-22T13:47:48.408564](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Project-Baize-v2-13B-GPTQ/blob/main/results_2023-08-22T13%3A47%3A48.408564.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2587648736963,
"acc_stderr": 0.03187184494961934,
"acc_norm": 0.2594229387695995,
"acc_norm_stderr": 0.03188074586823278,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752339,
"mc2": 0.48217112656241606,
"mc2_stderr": 0.01706048623340291
},
"harness|arc:challenge|25": {
"acc": 0.24573378839590443,
"acc_stderr": 0.01258103345373011,
"acc_norm": 0.27559726962457337,
"acc_norm_stderr": 0.013057169655761838
},
"harness|hellaswag|10": {
"acc": 0.2552280422226648,
"acc_stderr": 0.004350982826580599,
"acc_norm": 0.26419040031866164,
"acc_norm_stderr": 0.004400000822742062
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.24342105263157895,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.24342105263157895,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.20754716981132076,
"acc_stderr": 0.02495991802891127,
"acc_norm": 0.20754716981132076,
"acc_norm_stderr": 0.02495991802891127
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641143,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641143
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.02924188386962881,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.02924188386962881
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.0220190800122179,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.0220190800122179
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.038095238095238106,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.038095238095238106
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23225806451612904,
"acc_stderr": 0.02402225613030824,
"acc_norm": 0.23225806451612904,
"acc_norm_stderr": 0.02402225613030824
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.03178529710642751,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.03178529710642751
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25757575757575757,
"acc_stderr": 0.03115626951964686,
"acc_norm": 0.25757575757575757,
"acc_norm_stderr": 0.03115626951964686
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20207253886010362,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.20207253886010362,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.02075242372212802,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.02075242372212802
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.026653531596715484,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.026653531596715484
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.017923087667803057,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.017923087667803057
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03099866630456053,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03099866630456053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.030964517926923393,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.030964517926923393
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.21940928270042195,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.21940928270042195,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2600896860986547,
"acc_stderr": 0.029442495585857473,
"acc_norm": 0.2600896860986547,
"acc_norm_stderr": 0.029442495585857473
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.16030534351145037,
"acc_stderr": 0.0321782942074463,
"acc_norm": 0.16030534351145037,
"acc_norm_stderr": 0.0321782942074463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.038342410214190714,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.038342410214190714
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24786324786324787,
"acc_stderr": 0.0282863240755644,
"acc_norm": 0.24786324786324787,
"acc_norm_stderr": 0.0282863240755644
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2720306513409962,
"acc_stderr": 0.015913367447500524,
"acc_norm": 0.2720306513409962,
"acc_norm_stderr": 0.015913367447500524
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2630057803468208,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.2630057803468208,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261446,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261446
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02564686309713791,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02564686309713791
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2861736334405145,
"acc_stderr": 0.025670259242188947,
"acc_norm": 0.2861736334405145,
"acc_norm_stderr": 0.025670259242188947
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.31790123456790126,
"acc_stderr": 0.02591006352824087,
"acc_norm": 0.31790123456790126,
"acc_norm_stderr": 0.02591006352824087
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.026891709428343957,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.026891709428343957
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26140808344198174,
"acc_stderr": 0.01122252816977131,
"acc_norm": 0.26140808344198174,
"acc_norm_stderr": 0.01122252816977131
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.30514705882352944,
"acc_stderr": 0.0279715413701706,
"acc_norm": 0.30514705882352944,
"acc_norm_stderr": 0.0279715413701706
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.017848089574913226,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.017848089574913226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3224489795918367,
"acc_stderr": 0.029923100563683903,
"acc_norm": 0.3224489795918367,
"acc_norm_stderr": 0.029923100563683903
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2835820895522388,
"acc_stderr": 0.031871875379197966,
"acc_norm": 0.2835820895522388,
"acc_norm_stderr": 0.031871875379197966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.034605799075530255,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.034605799075530255
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21637426900584794,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.21637426900584794,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752339,
"mc2": 0.48217112656241606,
"mc2_stderr": 0.01706048623340291
}
}
```
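If you would rather work with the raw aggregated file than the `results` configuration, a minimal sketch is shown below. The filename is taken from the link above; whether the metrics sit at the top level of the JSON or under a `results` key is an assumption handled defensively in the snippet.
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file linked above from the dataset repository.
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TheBloke__Project-Baize-v2-13B-GPTQ",
    filename="results_2023-08-22T13:47:48.408564.json",
    repo_type="dataset",
)

with open(results_path) as f:
    raw = json.load(f)

# The card above shows the metrics keyed by task name; depending on the file
# layout they may be nested under a "results" key, so fall back gracefully.
metrics = raw.get("results", raw)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
```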
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__airoboros-33B-gpt4-1-4-SuperHOT-8K-fp16 | 2023-08-27T12:45:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/airoboros-33B-gpt4-1-4-SuperHOT-8K-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/airoboros-33B-gpt4-1-4-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/airoboros-33B-gpt4-1-4-SuperHOT-8K-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 60 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__airoboros-33B-gpt4-1-4-SuperHOT-8K-fp16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T21:17:33.530104](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__airoboros-33B-gpt4-1-4-SuperHOT-8K-fp16/blob/main/results_2023-08-22T21%3A17%3A33.530104.json)\
  \ (note that there might be results for other tasks in the repos if successive evals\
  \ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23623967517263394,\n\
\ \"acc_stderr\": 0.03091679562106068,\n \"acc_norm\": 0.23734838989030832,\n\
\ \"acc_norm_stderr\": 0.030928364256296522,\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487283,\n \"mc2\": 0.4792055778955594,\n\
\ \"mc2_stderr\": 0.016809354273525978\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23037542662116042,\n \"acc_stderr\": 0.01230492841874761,\n\
\ \"acc_norm\": 0.26023890784982934,\n \"acc_norm_stderr\": 0.012821930225112554\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2709619597689703,\n\
\ \"acc_stderr\": 0.0044354815159093975,\n \"acc_norm\": 0.30651264688309104,\n\
\ \"acc_norm_stderr\": 0.004601029188459098\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.025604233470899095,\n\
\ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.025604233470899095\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036844,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.267741935483871,\n \"acc_stderr\": 0.025189006660212378,\n \"\
acc_norm\": 0.267741935483871,\n \"acc_norm_stderr\": 0.025189006660212378\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15763546798029557,\n \"acc_stderr\": 0.025639014131172404,\n \"\
acc_norm\": 0.15763546798029557,\n \"acc_norm_stderr\": 0.025639014131172404\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.18686868686868688,\n \"acc_stderr\": 0.027772533334218977,\n \"\
acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.027772533334218977\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.18652849740932642,\n \"acc_stderr\": 0.028112091210117485,\n\
\ \"acc_norm\": 0.18652849740932642,\n \"acc_norm_stderr\": 0.028112091210117485\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2076923076923077,\n \"acc_stderr\": 0.020567539567246797,\n\
\ \"acc_norm\": 0.2076923076923077,\n \"acc_norm_stderr\": 0.020567539567246797\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25210084033613445,\n \"acc_stderr\": 0.028205545033277723,\n\
\ \"acc_norm\": 0.25210084033613445,\n \"acc_norm_stderr\": 0.028205545033277723\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2119205298013245,\n \"acc_stderr\": 0.03336767086567976,\n \"\
acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.03336767086567976\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1944954128440367,\n \"acc_stderr\": 0.016970289090458047,\n \"\
acc_norm\": 0.1944954128440367,\n \"acc_norm_stderr\": 0.016970289090458047\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.02904133351059804,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.02904133351059804\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.29914529914529914,\n\
\ \"acc_stderr\": 0.029996951858349497,\n \"acc_norm\": 0.29914529914529914,\n\
\ \"acc_norm_stderr\": 0.029996951858349497\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23371647509578544,\n\
\ \"acc_stderr\": 0.015133383278988837,\n \"acc_norm\": 0.23371647509578544,\n\
\ \"acc_norm_stderr\": 0.015133383278988837\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.1830065359477124,\n \"acc_stderr\": 0.022140767512880976,\n\
\ \"acc_norm\": 0.1830065359477124,\n \"acc_norm_stderr\": 0.022140767512880976\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.023016705640262203,\n\
\ \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.023016705640262203\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.22695035460992907,\n \"acc_stderr\": 0.02498710636564297,\n \
\ \"acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.02498710636564297\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25326797385620914,\n \"acc_stderr\": 0.01759348689536683,\n \
\ \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.01759348689536683\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.035650796707083106,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.035650796707083106\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487283,\n \"mc2\": 0.4792055778955594,\n\
\ \"mc2_stderr\": 0.016809354273525978\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/airoboros-33B-gpt4-1-4-SuperHOT-8K-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|arc:challenge|25_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hellaswag|10_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T21:17:33.530104.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T21:17:33.530104.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T21_17_33.530104
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T21:17:33.530104.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T21:17:33.530104.parquet'
---
# Dataset Card for Evaluation run of TheBloke/airoboros-33B-gpt4-1-4-SuperHOT-8K-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/airoboros-33B-gpt4-1-4-SuperHOT-8K-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/airoboros-33B-gpt4-1-4-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/airoboros-33B-gpt4-1-4-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__airoboros-33B-gpt4-1-4-SuperHOT-8K-fp16",
"harness_truthfulqa_mc_0",
split="train")
```
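Each per-task configuration listed in the YAML metadata above also exposes both a timestamped split and a "latest" split. As a minimal sketch (assuming the configuration and split names declared above, e.g. `harness_hendrycksTest_abstract_algebra_5`), the latest details for a single sub-task can be loaded like this:
```python
from datasets import load_dataset

# "latest" points at the same parquet file as the most recent timestamped split.
abstract_algebra = load_dataset(
    "open-llm-leaderboard/details_TheBloke__airoboros-33B-gpt4-1-4-SuperHOT-8K-fp16",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(abstract_algebra)
```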
## Latest results
These are the [latest results from run 2023-08-22T21:17:33.530104](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__airoboros-33B-gpt4-1-4-SuperHOT-8K-fp16/blob/main/results_2023-08-22T21%3A17%3A33.530104.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23623967517263394,
"acc_stderr": 0.03091679562106068,
"acc_norm": 0.23734838989030832,
"acc_norm_stderr": 0.030928364256296522,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487283,
"mc2": 0.4792055778955594,
"mc2_stderr": 0.016809354273525978
},
"harness|arc:challenge|25": {
"acc": 0.23037542662116042,
"acc_stderr": 0.01230492841874761,
"acc_norm": 0.26023890784982934,
"acc_norm_stderr": 0.012821930225112554
},
"harness|hellaswag|10": {
"acc": 0.2709619597689703,
"acc_stderr": 0.0044354815159093975,
"acc_norm": 0.30651264688309104,
"acc_norm_stderr": 0.004601029188459098
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.025604233470899095,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.025604233470899095
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.267741935483871,
"acc_stderr": 0.025189006660212378,
"acc_norm": 0.267741935483871,
"acc_norm_stderr": 0.025189006660212378
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15763546798029557,
"acc_stderr": 0.025639014131172404,
"acc_norm": 0.15763546798029557,
"acc_norm_stderr": 0.025639014131172404
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18686868686868688,
"acc_stderr": 0.027772533334218977,
"acc_norm": 0.18686868686868688,
"acc_norm_stderr": 0.027772533334218977
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.18652849740932642,
"acc_stderr": 0.028112091210117485,
"acc_norm": 0.18652849740932642,
"acc_norm_stderr": 0.028112091210117485
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2076923076923077,
"acc_stderr": 0.020567539567246797,
"acc_norm": 0.2076923076923077,
"acc_norm_stderr": 0.020567539567246797
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275794,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275794
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25210084033613445,
"acc_stderr": 0.028205545033277723,
"acc_norm": 0.25210084033613445,
"acc_norm_stderr": 0.028205545033277723
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.03336767086567976,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.03336767086567976
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1944954128440367,
"acc_stderr": 0.016970289090458047,
"acc_norm": 0.1944954128440367,
"acc_norm_stderr": 0.016970289090458047
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.02904133351059804,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.02904133351059804
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.29914529914529914,
"acc_stderr": 0.029996951858349497,
"acc_norm": 0.29914529914529914,
"acc_norm_stderr": 0.029996951858349497
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23371647509578544,
"acc_stderr": 0.015133383278988837,
"acc_norm": 0.23371647509578544,
"acc_norm_stderr": 0.015133383278988837
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.1830065359477124,
"acc_stderr": 0.022140767512880976,
"acc_norm": 0.1830065359477124,
"acc_norm_stderr": 0.022140767512880976
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.023016705640262203,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.023016705640262203
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22695035460992907,
"acc_stderr": 0.02498710636564297,
"acc_norm": 0.22695035460992907,
"acc_norm_stderr": 0.02498710636564297
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.01759348689536683,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.01759348689536683
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487283,
"mc2": 0.4792055778955594,
"mc2_stderr": 0.016809354273525978
}
}
```
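To inspect the raw results file linked above without going through `load_dataset`, one possible sketch downloads the JSON directly with `huggingface_hub` (the filename is taken from the link above; the exact layout of the file may differ slightly from the snippet shown here):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON from the root of the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TheBloke__airoboros-33B-gpt4-1-4-SuperHOT-8K-fp16",
    filename="results_2023-08-22T21:17:33.530104.json",
    repo_type="dataset",
)
with open(path) as f:
    raw_results = json.load(f)
print(list(raw_results.keys()))
```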
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__orca_mini_v3_7B-GPTQ | 2023-08-27T12:45:05.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/orca_mini_v3_7B-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/orca_mini_v3_7B-GPTQ](https://huggingface.co/TheBloke/orca_mini_v3_7B-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__orca_mini_v3_7B-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T13:46:10.418493](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__orca_mini_v3_7B-GPTQ/blob/main/results_2023-08-22T13%3A46%3A10.418493.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24319179243780645,\n\
\ \"acc_stderr\": 0.031158862195277914,\n \"acc_norm\": 0.244389737294318,\n\
\ \"acc_norm_stderr\": 0.031176402192385405,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570345,\n \"mc2\": 0.48438228589364474,\n\
\ \"mc2_stderr\": 0.01700627816360895\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2354948805460751,\n \"acc_stderr\": 0.012399451855004748,\n\
\ \"acc_norm\": 0.30119453924914674,\n \"acc_norm_stderr\": 0.013406741767847617\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25492929695279826,\n\
\ \"acc_stderr\": 0.0043493077027351645,\n \"acc_norm\": 0.2599083847839076,\n\
\ \"acc_norm_stderr\": 0.0043768776192341175\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.26666666666666666,\n\
\ \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677088,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677088\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.031265112061730424,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.031265112061730424\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.02937917046412482,\n\
\ \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.02937917046412482\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537317,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537317\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525218,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525218\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276863,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276863\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.0316185633535861,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.0316185633535861\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.18686868686868688,\n \"acc_stderr\": 0.02777253333421898,\n \"\
acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.02777253333421898\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.030975436386845436,\n\
\ \"acc_norm\": 0.24352331606217617,\n \"acc_norm_stderr\": 0.030975436386845436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.021444547301560486,\n\
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.021444547301560486\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514568,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514568\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868963,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868963\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.25137614678899084,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.25137614678899084,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.14351851851851852,\n \"acc_stderr\": 0.02391077925264438,\n \"\
acc_norm\": 0.14351851851851852,\n \"acc_norm_stderr\": 0.02391077925264438\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.29411764705882354,\n \"acc_stderr\": 0.0319800166011507,\n \"\
acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.0319800166011507\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2320675105485232,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.2320675105485232,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.28699551569506726,\n\
\ \"acc_stderr\": 0.030360379710291947,\n \"acc_norm\": 0.28699551569506726,\n\
\ \"acc_norm_stderr\": 0.030360379710291947\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.14563106796116504,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.14563106796116504,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.028120966503914414,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.028120966503914414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24776500638569604,\n\
\ \"acc_stderr\": 0.01543808308056897,\n \"acc_norm\": 0.24776500638569604,\n\
\ \"acc_norm_stderr\": 0.01543808308056897\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22254335260115607,\n \"acc_stderr\": 0.02239421566194282,\n\
\ \"acc_norm\": 0.22254335260115607,\n \"acc_norm_stderr\": 0.02239421566194282\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961447,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961447\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21895424836601307,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2379421221864952,\n\
\ \"acc_stderr\": 0.024185150647818707,\n \"acc_norm\": 0.2379421221864952,\n\
\ \"acc_norm_stderr\": 0.024185150647818707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266736,\n \
\ \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266736\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25358539765319427,\n\
\ \"acc_stderr\": 0.011111715336101129,\n \"acc_norm\": 0.25358539765319427,\n\
\ \"acc_norm_stderr\": 0.011111715336101129\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.15808823529411764,\n \"acc_stderr\": 0.02216146260806851,\n\
\ \"acc_norm\": 0.15808823529411764,\n \"acc_norm_stderr\": 0.02216146260806851\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.01777694715752803,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.01777694715752803\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.024127463462650153,\n\
\ \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.024127463462650153\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348398,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348398\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n\
\ \"acc_stderr\": 0.03711725190740749,\n \"acc_norm\": 0.3493975903614458,\n\
\ \"acc_norm_stderr\": 0.03711725190740749\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.25146198830409355,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.25146198830409355,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570345,\n \"mc2\": 0.48438228589364474,\n\
\ \"mc2_stderr\": 0.01700627816360895\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/orca_mini_v3_7B-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:46:10.418493.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:46:10.418493.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T13_46_10.418493
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:46:10.418493.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:46:10.418493.parquet'
---
# Dataset Card for Evaluation run of TheBloke/orca_mini_v3_7B-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/orca_mini_v3_7B-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/orca_mini_v3_7B-GPTQ](https://huggingface.co/TheBloke/orca_mini_v3_7B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__orca_mini_v3_7B-GPTQ",
"harness_truthfulqa_mc_0",
	split="latest")
```
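As a further example, you can point at any of the per-task configurations listed in the header above and its `latest` split, for instance one of the MMLU subtasks. This is a minimal sketch; the configuration and split names are taken directly from the YAML header of this card:
```python
from datasets import load_dataset

# Per-sample details for a single MMLU subtask; config names follow the
# `harness_hendrycksTest_<subject>_5` pattern defined in this card's header.
details = load_dataset(
    "open-llm-leaderboard/details_TheBloke__orca_mini_v3_7B-GPTQ",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details[0])  # inspect one evaluated example
```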
## Latest results
These are the [latest results from run 2023-08-22T13:46:10.418493](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__orca_mini_v3_7B-GPTQ/blob/main/results_2023-08-22T13%3A46%3A10.418493.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24319179243780645,
"acc_stderr": 0.031158862195277914,
"acc_norm": 0.244389737294318,
"acc_norm_stderr": 0.031176402192385405,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.48438228589364474,
"mc2_stderr": 0.01700627816360895
},
"harness|arc:challenge|25": {
"acc": 0.2354948805460751,
"acc_stderr": 0.012399451855004748,
"acc_norm": 0.30119453924914674,
"acc_norm_stderr": 0.013406741767847617
},
"harness|hellaswag|10": {
"acc": 0.25492929695279826,
"acc_stderr": 0.0043493077027351645,
"acc_norm": 0.2599083847839076,
"acc_norm_stderr": 0.0043768776192341175
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.031265112061730424,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.031265112061730424
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.02937917046412482,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.02937917046412482
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537317,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537317
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525218,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276863,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276863
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.0316185633535861,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.0316185633535861
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18686868686868688,
"acc_stderr": 0.02777253333421898,
"acc_norm": 0.18686868686868688,
"acc_norm_stderr": 0.02777253333421898
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.030975436386845436,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.030975436386845436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.021444547301560486,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.021444547301560486
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514568,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514568
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868963,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868963
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25137614678899084,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.25137614678899084,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.14351851851851852,
"acc_stderr": 0.02391077925264438,
"acc_norm": 0.14351851851851852,
"acc_norm_stderr": 0.02391077925264438
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.0319800166011507,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.0319800166011507
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2320675105485232,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.2320675105485232,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.28699551569506726,
"acc_stderr": 0.030360379710291947,
"acc_norm": 0.28699551569506726,
"acc_norm_stderr": 0.030360379710291947
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.14563106796116504,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.14563106796116504,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.028120966503914414,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.028120966503914414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24776500638569604,
"acc_stderr": 0.01543808308056897,
"acc_norm": 0.24776500638569604,
"acc_norm_stderr": 0.01543808308056897
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22254335260115607,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.22254335260115607,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961447,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961447
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21895424836601307,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.21895424836601307,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2379421221864952,
"acc_stderr": 0.024185150647818707,
"acc_norm": 0.2379421221864952,
"acc_norm_stderr": 0.024185150647818707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.025645553622266736,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.025645553622266736
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25358539765319427,
"acc_stderr": 0.011111715336101129,
"acc_norm": 0.25358539765319427,
"acc_norm_stderr": 0.011111715336101129
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.15808823529411764,
"acc_stderr": 0.02216146260806851,
"acc_norm": 0.15808823529411764,
"acc_norm_stderr": 0.02216146260806851
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.01777694715752803,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.01777694715752803
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.024127463462650153,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.024127463462650153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348398,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348398
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3493975903614458,
"acc_stderr": 0.03711725190740749,
"acc_norm": 0.3493975903614458,
"acc_norm_stderr": 0.03711725190740749
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.25146198830409355,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.25146198830409355,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.48438228589364474,
"mc2_stderr": 0.01700627816360895
}
}
```
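If you prefer to work with the aggregated metrics directly, the raw results file linked above can also be downloaded from the dataset repository. This is a sketch assuming the file keeps the name shown in the link; a later run would produce a file with a newer timestamp:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the aggregated results JSON referenced above (the filename is taken
# from the link; replace it if a newer run has produced a more recent file).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TheBloke__orca_mini_v3_7B-GPTQ",
    filename="results_2023-08-22T13:46:10.418493.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(sorted(results.keys()))  # inspect the top-level layout before drilling in
```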
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__openchat_v2_openorca_preview-GPTQ | 2023-08-27T12:45:07.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/openchat_v2_openorca_preview-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/openchat_v2_openorca_preview-GPTQ](https://huggingface.co/TheBloke/openchat_v2_openorca_preview-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__openchat_v2_openorca_preview-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T11:30:59.875390](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__openchat_v2_openorca_preview-GPTQ/blob/main/results_2023-08-22T11%3A30%3A59.875390.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24246694991848233,\n\
\ \"acc_stderr\": 0.03117174175399139,\n \"acc_norm\": 0.2433122513905999,\n\
\ \"acc_norm_stderr\": 0.03118519546005906,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931598,\n \"mc2\": 0.5007929816225261,\n\
\ \"mc2_stderr\": 0.017079917935026806\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23208191126279865,\n \"acc_stderr\": 0.012336718284948854,\n\
\ \"acc_norm\": 0.27986348122866894,\n \"acc_norm_stderr\": 0.013119040897725922\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25851424019119695,\n\
\ \"acc_stderr\": 0.004369232540125881,\n \"acc_norm\": 0.2606054570802629,\n\
\ \"acc_norm_stderr\": 0.004380678585341419\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653695\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.02725726032249485,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.02725726032249485\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641143,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641143\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.18723404255319148,\n \"acc_stderr\": 0.025501588341883603,\n\
\ \"acc_norm\": 0.18723404255319148,\n \"acc_norm_stderr\": 0.025501588341883603\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.16551724137931034,\n \"acc_stderr\": 0.030970559966224085,\n\
\ \"acc_norm\": 0.16551724137931034,\n \"acc_norm_stderr\": 0.030970559966224085\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23548387096774193,\n\
\ \"acc_stderr\": 0.024137632429337707,\n \"acc_norm\": 0.23548387096774193,\n\
\ \"acc_norm_stderr\": 0.024137632429337707\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.18719211822660098,\n \"acc_stderr\": 0.027444924966882618,\n\
\ \"acc_norm\": 0.18719211822660098,\n \"acc_norm_stderr\": 0.027444924966882618\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23737373737373738,\n \"acc_stderr\": 0.030313710538198892,\n \"\
acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.030313710538198892\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.030975436386845436,\n\
\ \"acc_norm\": 0.24352331606217617,\n \"acc_norm_stderr\": 0.030975436386845436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.30256410256410254,\n \"acc_stderr\": 0.02329088805377272,\n\
\ \"acc_norm\": 0.30256410256410254,\n \"acc_norm_stderr\": 0.02329088805377272\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2,\n \"acc_stderr\": 0.02438843043398766,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.02438843043398766\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.029597329730978082,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.029597329730978082\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.036313298039696545,\n \"\
acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696545\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26788990825688075,\n \"acc_stderr\": 0.01898746225797865,\n \"\
acc_norm\": 0.26788990825688075,\n \"acc_norm_stderr\": 0.01898746225797865\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.23148148148148148,\n \"acc_stderr\": 0.02876511171804696,\n \"\
acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.02876511171804696\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.02977177522814563,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02977177522814563\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.21518987341772153,\n \"acc_stderr\": 0.026750826994676177,\n \
\ \"acc_norm\": 0.21518987341772153,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.19282511210762332,\n\
\ \"acc_stderr\": 0.02647824096048936,\n \"acc_norm\": 0.19282511210762332,\n\
\ \"acc_norm_stderr\": 0.02647824096048936\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.19008264462809918,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.19008264462809918,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.041331194402438404,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.041331194402438404\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.03259177392742177,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.03259177392742177\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.042450224863844935,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.042450224863844935\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23504273504273504,\n\
\ \"acc_stderr\": 0.027778835904935427,\n \"acc_norm\": 0.23504273504273504,\n\
\ \"acc_norm_stderr\": 0.027778835904935427\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21328224776500637,\n\
\ \"acc_stderr\": 0.014648172749593522,\n \"acc_norm\": 0.21328224776500637,\n\
\ \"acc_norm_stderr\": 0.014648172749593522\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.022797110278071134,\n\
\ \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.022797110278071134\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261466,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261466\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.02463004897982478,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.02463004897982478\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21221864951768488,\n\
\ \"acc_stderr\": 0.023222756797435122,\n \"acc_norm\": 0.21221864951768488,\n\
\ \"acc_norm_stderr\": 0.023222756797435122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600713006,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600713006\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2529335071707953,\n\
\ \"acc_stderr\": 0.01110226871383999,\n \"acc_norm\": 0.2529335071707953,\n\
\ \"acc_norm_stderr\": 0.01110226871383999\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2610294117647059,\n \"acc_stderr\": 0.02667925227010313,\n\
\ \"acc_norm\": 0.2610294117647059,\n \"acc_norm_stderr\": 0.02667925227010313\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25326797385620914,\n \"acc_stderr\": 0.017593486895366835,\n \
\ \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.017593486895366835\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.03895091015724138,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.03895091015724138\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2897959183673469,\n \"acc_stderr\": 0.029043088683304342,\n\
\ \"acc_norm\": 0.2897959183673469,\n \"acc_norm_stderr\": 0.029043088683304342\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\
\ \"acc_stderr\": 0.03384429155233134,\n \"acc_norm\": 0.25301204819277107,\n\
\ \"acc_norm_stderr\": 0.03384429155233134\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931598,\n \"mc2\": 0.5007929816225261,\n\
\ \"mc2_stderr\": 0.017079917935026806\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/openchat_v2_openorca_preview-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|arc:challenge|25_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hellaswag|10_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T11:30:59.875390.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T11:30:59.875390.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T11_30_59.875390
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T11:30:59.875390.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T11:30:59.875390.parquet'
---
# Dataset Card for Evaluation run of TheBloke/openchat_v2_openorca_preview-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/openchat_v2_openorca_preview-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/openchat_v2_openorca_preview-GPTQ](https://huggingface.co/TheBloke/openchat_v2_openorca_preview-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__openchat_v2_openorca_preview-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
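The per-task configurations listed in the metadata above (for example `harness_hendrycksTest_abstract_algebra_5`) can be loaded the same way. The sketch below is only an illustrative variant of the snippet above: it uses the `latest` split declared for every configuration, which resolves to the most recent evaluation timestamp.
```python
from datasets import load_dataset

# Details for a single 5-shot MMLU subtask; the "latest" split (declared in the
# configs above) always points at the most recent run's parquet files.
details = load_dataset(
    "open-llm-leaderboard/details_TheBloke__openchat_v2_openorca_preview-GPTQ",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details)
```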
## Latest results
These are the [latest results from run 2023-08-22T11:30:59.875390](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__openchat_v2_openorca_preview-GPTQ/blob/main/results_2023-08-22T11%3A30%3A59.875390.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's scores in the aggregated results and in the "latest" split of the corresponding configuration):
```python
{
"all": {
"acc": 0.24246694991848233,
"acc_stderr": 0.03117174175399139,
"acc_norm": 0.2433122513905999,
"acc_norm_stderr": 0.03118519546005906,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931598,
"mc2": 0.5007929816225261,
"mc2_stderr": 0.017079917935026806
},
"harness|arc:challenge|25": {
"acc": 0.23208191126279865,
"acc_stderr": 0.012336718284948854,
"acc_norm": 0.27986348122866894,
"acc_norm_stderr": 0.013119040897725922
},
"harness|hellaswag|10": {
"acc": 0.25851424019119695,
"acc_stderr": 0.004369232540125881,
"acc_norm": 0.2606054570802629,
"acc_norm_stderr": 0.004380678585341419
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.02725726032249485,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.02725726032249485
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641143,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641143
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.18723404255319148,
"acc_stderr": 0.025501588341883603,
"acc_norm": 0.18723404255319148,
"acc_norm_stderr": 0.025501588341883603
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.16551724137931034,
"acc_stderr": 0.030970559966224085,
"acc_norm": 0.16551724137931034,
"acc_norm_stderr": 0.030970559966224085
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643895,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643895
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23548387096774193,
"acc_stderr": 0.024137632429337707,
"acc_norm": 0.23548387096774193,
"acc_norm_stderr": 0.024137632429337707
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18719211822660098,
"acc_stderr": 0.027444924966882618,
"acc_norm": 0.18719211822660098,
"acc_norm_stderr": 0.027444924966882618
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.030313710538198892,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.030313710538198892
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.030975436386845436,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.030975436386845436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30256410256410254,
"acc_stderr": 0.02329088805377272,
"acc_norm": 0.30256410256410254,
"acc_norm_stderr": 0.02329088805377272
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.02438843043398766,
"acc_norm": 0.2,
"acc_norm_stderr": 0.02438843043398766
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.029597329730978082,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.029597329730978082
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696545,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696545
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26788990825688075,
"acc_stderr": 0.01898746225797865,
"acc_norm": 0.26788990825688075,
"acc_norm_stderr": 0.01898746225797865
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.02876511171804696,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.02876511171804696
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.21518987341772153,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.21518987341772153,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.19282511210762332,
"acc_stderr": 0.02647824096048936,
"acc_norm": 0.19282511210762332,
"acc_norm_stderr": 0.02647824096048936
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.19008264462809918,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.19008264462809918,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438404,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438404
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.03259177392742177,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.03259177392742177
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.042450224863844935,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.042450224863844935
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23504273504273504,
"acc_stderr": 0.027778835904935427,
"acc_norm": 0.23504273504273504,
"acc_norm_stderr": 0.027778835904935427
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21328224776500637,
"acc_stderr": 0.014648172749593522,
"acc_norm": 0.21328224776500637,
"acc_norm_stderr": 0.014648172749593522
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.022797110278071134,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.022797110278071134
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261466,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261466
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21221864951768488,
"acc_stderr": 0.023222756797435122,
"acc_norm": 0.21221864951768488,
"acc_norm_stderr": 0.023222756797435122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600713006,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600713006
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2529335071707953,
"acc_stderr": 0.01110226871383999,
"acc_norm": 0.2529335071707953,
"acc_norm_stderr": 0.01110226871383999
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2610294117647059,
"acc_stderr": 0.02667925227010313,
"acc_norm": 0.2610294117647059,
"acc_norm_stderr": 0.02667925227010313
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.017593486895366835,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.017593486895366835
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724138,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724138
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2897959183673469,
"acc_stderr": 0.029043088683304342,
"acc_norm": 0.2897959183673469,
"acc_norm_stderr": 0.029043088683304342
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.03384429155233134,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.03384429155233134
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931598,
"mc2": 0.5007929816225261,
"mc2_stderr": 0.017079917935026806
}
}
```
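For quick aggregate checks it can also be convenient to read the raw results file linked above directly instead of going through `load_dataset`. The sketch below is a minimal example, not an official recipe: it assumes the `huggingface_hub` client is installed and uses the filename decoded from the link above, and since the per-task entries may sit either at the top level (as printed above) or under a `"results"` key, both cases are handled.
```python
import json
from huggingface_hub import hf_hub_download

# Fetch the raw results JSON referenced in the link above
# (filename is an assumption decoded from that URL).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TheBloke__openchat_v2_openorca_preview-GPTQ",
    filename="results_2023-08-22T11:30:59.875390.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The per-task scores may be nested under a "results" key; fall back to the top level.
results = data.get("results", data)

# Average 5-shot accuracy over the hendrycksTest (MMLU) subtasks.
mmlu = {task: scores for task, scores in results.items()
        if task.startswith("harness|hendrycksTest")}
print(f"MMLU subtasks: {len(mmlu)}")
print(f"Mean acc: {sum(s['acc'] for s in mmlu.values()) / len(mmlu):.4f}")
```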
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__robin-33B-v2-GPTQ | 2023-08-27T12:45:09.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/robin-33B-v2-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/robin-33B-v2-GPTQ](https://huggingface.co/TheBloke/robin-33B-v2-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__robin-33B-v2-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T13:23:21.800878](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__robin-33B-v2-GPTQ/blob/main/results_2023-08-22T13%3A23%3A21.800878.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23552388556599335,\n\
\ \"acc_stderr\": 0.030915991946675134,\n \"acc_norm\": 0.23644490880538102,\n\
\ \"acc_norm_stderr\": 0.030929664385254164,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570345,\n \"mc2\": 0.49536982988840905,\n\
\ \"mc2_stderr\": 0.016949260989828546\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23122866894197952,\n \"acc_stderr\": 0.012320858834772278,\n\
\ \"acc_norm\": 0.2773037542662116,\n \"acc_norm_stderr\": 0.013082095839059374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2546305516829317,\n\
\ \"acc_stderr\": 0.0043476298890409385,\n \"acc_norm\": 0.26289583748257317,\n\
\ \"acc_norm_stderr\": 0.004393066760916822\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653695\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.034260594244031654,\n\
\ \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.034260594244031654\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.02713429162874172,\n\
\ \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.02713429162874172\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.033687629322594295,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.033687629322594295\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.18723404255319148,\n \"acc_stderr\": 0.025501588341883603,\n\
\ \"acc_norm\": 0.18723404255319148,\n \"acc_norm_stderr\": 0.025501588341883603\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.1724137931034483,\n \"acc_stderr\": 0.03147830790259574,\n\
\ \"acc_norm\": 0.1724137931034483,\n \"acc_norm_stderr\": 0.03147830790259574\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184756,\n \"\
acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184756\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.037649508797906045,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.037649508797906045\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22903225806451613,\n\
\ \"acc_stderr\": 0.023904914311782658,\n \"acc_norm\": 0.22903225806451613,\n\
\ \"acc_norm_stderr\": 0.023904914311782658\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.1921182266009852,\n \"acc_stderr\": 0.02771931570961477,\n\
\ \"acc_norm\": 0.1921182266009852,\n \"acc_norm_stderr\": 0.02771931570961477\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322674,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.041633319989322674\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23737373737373738,\n \"acc_stderr\": 0.030313710538198892,\n \"\
acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.030313710538198892\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565319,\n\
\ \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565319\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.29743589743589743,\n \"acc_stderr\": 0.02317740813146595,\n\
\ \"acc_norm\": 0.29743589743589743,\n \"acc_norm_stderr\": 0.02317740813146595\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2074074074074074,\n \"acc_stderr\": 0.024720713193952158,\n \
\ \"acc_norm\": 0.2074074074074074,\n \"acc_norm_stderr\": 0.024720713193952158\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.029597329730978082,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.029597329730978082\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804725,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804725\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.27706422018348625,\n \"acc_stderr\": 0.01918848259016954,\n \"\
acc_norm\": 0.27706422018348625,\n \"acc_norm_stderr\": 0.01918848259016954\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.17592592592592593,\n \"acc_stderr\": 0.025967420958258533,\n \"\
acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.025967420958258533\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145638,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145638\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.21518987341772153,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.21518987341772153,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.19282511210762332,\n\
\ \"acc_stderr\": 0.02647824096048936,\n \"acc_norm\": 0.19282511210762332,\n\
\ \"acc_norm_stderr\": 0.02647824096048936\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2809917355371901,\n \"acc_stderr\": 0.041032038305145124,\n \"\
acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.041032038305145124\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.041331194402438404,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.041331194402438404\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467764,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467764\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.042450224863844935,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.042450224863844935\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23504273504273504,\n\
\ \"acc_stderr\": 0.027778835904935427,\n \"acc_norm\": 0.23504273504273504,\n\
\ \"acc_norm_stderr\": 0.027778835904935427\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21328224776500637,\n\
\ \"acc_stderr\": 0.014648172749593522,\n \"acc_norm\": 0.21328224776500637,\n\
\ \"acc_norm_stderr\": 0.014648172749593522\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.022989592543123563,\n\
\ \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.022989592543123563\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2335195530726257,\n\
\ \"acc_stderr\": 0.01414957534897625,\n \"acc_norm\": 0.2335195530726257,\n\
\ \"acc_norm_stderr\": 0.01414957534897625\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.02392915551735129,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02392915551735129\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21221864951768488,\n\
\ \"acc_stderr\": 0.023222756797435122,\n \"acc_norm\": 0.21221864951768488,\n\
\ \"acc_norm_stderr\": 0.023222756797435122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.19135802469135801,\n \"acc_stderr\": 0.02188770461339615,\n\
\ \"acc_norm\": 0.19135802469135801,\n \"acc_norm_stderr\": 0.02188770461339615\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432403,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432403\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25358539765319427,\n\
\ \"acc_stderr\": 0.011111715336101132,\n \"acc_norm\": 0.25358539765319427,\n\
\ \"acc_norm_stderr\": 0.011111715336101132\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21691176470588236,\n \"acc_stderr\": 0.02503584522771127,\n\
\ \"acc_norm\": 0.21691176470588236,\n \"acc_norm_stderr\": 0.02503584522771127\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24509803921568626,\n \"acc_stderr\": 0.017401816711427657,\n \
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.017401816711427657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17551020408163265,\n \"acc_stderr\": 0.024352800722970018,\n\
\ \"acc_norm\": 0.17551020408163265,\n \"acc_norm_stderr\": 0.024352800722970018\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064537,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064537\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570345,\n \"mc2\": 0.49536982988840905,\n\
\ \"mc2_stderr\": 0.016949260989828546\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/robin-33B-v2-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:23:21.800878.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:23:21.800878.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T13_23_21.800878
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:23:21.800878.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:23:21.800878.parquet'
---
# Dataset Card for Evaluation run of TheBloke/robin-33B-v2-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/robin-33B-v2-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/robin-33B-v2-GPTQ](https://huggingface.co/TheBloke/robin-33B-v2-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__robin-33B-v2-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
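The same call works for any of the per-task configurations listed in the YAML header above; as a minimal sketch, this pulls only the most recent evaluation of one MMLU sub-task via its `latest` split (config and split names are taken from the configuration list above):
```python
from datasets import load_dataset

# Load only the most recent run of a single MMLU sub-task, using the
# "latest" split defined for that configuration.
data = load_dataset("open-llm-leaderboard/details_TheBloke__robin-33B-v2-GPTQ",
	"harness_hendrycksTest_abstract_algebra_5",
	split="latest")
```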
## Latest results
These are the [latest results from run 2023-08-22T13:23:21.800878](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__robin-33B-v2-GPTQ/blob/main/results_2023-08-22T13%3A23%3A21.800878.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23552388556599335,
"acc_stderr": 0.030915991946675134,
"acc_norm": 0.23644490880538102,
"acc_norm_stderr": 0.030929664385254164,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.49536982988840905,
"mc2_stderr": 0.016949260989828546
},
"harness|arc:challenge|25": {
"acc": 0.23122866894197952,
"acc_stderr": 0.012320858834772278,
"acc_norm": 0.2773037542662116,
"acc_norm_stderr": 0.013082095839059374
},
"harness|hellaswag|10": {
"acc": 0.2546305516829317,
"acc_stderr": 0.0043476298890409385,
"acc_norm": 0.26289583748257317,
"acc_norm_stderr": 0.004393066760916822
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23026315789473684,
"acc_stderr": 0.034260594244031654,
"acc_norm": 0.23026315789473684,
"acc_norm_stderr": 0.034260594244031654
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.02713429162874172,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.02713429162874172
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.033687629322594295,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.033687629322594295
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.18723404255319148,
"acc_stderr": 0.025501588341883603,
"acc_norm": 0.18723404255319148,
"acc_norm_stderr": 0.025501588341883603
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.1724137931034483,
"acc_stderr": 0.03147830790259574,
"acc_norm": 0.1724137931034483,
"acc_norm_stderr": 0.03147830790259574
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184756,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184756
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.037649508797906045,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.037649508797906045
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22903225806451613,
"acc_stderr": 0.023904914311782658,
"acc_norm": 0.22903225806451613,
"acc_norm_stderr": 0.023904914311782658
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1921182266009852,
"acc_stderr": 0.02771931570961477,
"acc_norm": 0.1921182266009852,
"acc_norm_stderr": 0.02771931570961477
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322674,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322674
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.030313710538198892,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.030313710538198892
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565319,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565319
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.29743589743589743,
"acc_stderr": 0.02317740813146595,
"acc_norm": 0.29743589743589743,
"acc_norm_stderr": 0.02317740813146595
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.024720713193952158,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.024720713193952158
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.029597329730978082,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.029597329730978082
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804725,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804725
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27706422018348625,
"acc_stderr": 0.01918848259016954,
"acc_norm": 0.27706422018348625,
"acc_norm_stderr": 0.01918848259016954
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.17592592592592593,
"acc_stderr": 0.025967420958258533,
"acc_norm": 0.17592592592592593,
"acc_norm_stderr": 0.025967420958258533
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145638,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145638
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.21518987341772153,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.21518987341772153,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.19282511210762332,
"acc_stderr": 0.02647824096048936,
"acc_norm": 0.19282511210762332,
"acc_norm_stderr": 0.02647824096048936
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.041032038305145124,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.041032038305145124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438404,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438404
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467764,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467764
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.042450224863844935,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.042450224863844935
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23504273504273504,
"acc_stderr": 0.027778835904935427,
"acc_norm": 0.23504273504273504,
"acc_norm_stderr": 0.027778835904935427
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21328224776500637,
"acc_stderr": 0.014648172749593522,
"acc_norm": 0.21328224776500637,
"acc_norm_stderr": 0.014648172749593522
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.022989592543123563,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.022989592543123563
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2335195530726257,
"acc_stderr": 0.01414957534897625,
"acc_norm": 0.2335195530726257,
"acc_norm_stderr": 0.01414957534897625
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.02392915551735129,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.02392915551735129
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21221864951768488,
"acc_stderr": 0.023222756797435122,
"acc_norm": 0.21221864951768488,
"acc_norm_stderr": 0.023222756797435122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.19135802469135801,
"acc_stderr": 0.02188770461339615,
"acc_norm": 0.19135802469135801,
"acc_norm_stderr": 0.02188770461339615
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432403,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432403
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25358539765319427,
"acc_stderr": 0.011111715336101132,
"acc_norm": 0.25358539765319427,
"acc_norm_stderr": 0.011111715336101132
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21691176470588236,
"acc_stderr": 0.02503584522771127,
"acc_norm": 0.21691176470588236,
"acc_norm_stderr": 0.02503584522771127
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.017401816711427657,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.017401816711427657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17551020408163265,
"acc_stderr": 0.024352800722970018,
"acc_norm": 0.17551020408163265,
"acc_norm_stderr": 0.024352800722970018
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064537,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064537
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.49536982988840905,
"mc2_stderr": 0.016949260989828546
}
}
```
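For quick inspection without materialising the detail datasets, the timestamped results file linked above can also be fetched directly. This is only a sketch under stated assumptions: the filename is decoded from the link above and the nesting of the metrics may differ between harness versions.
```python
import json
from huggingface_hub import hf_hub_download

# Download the timestamped results file linked above from the dataset repo.
# The filename follows the results_<timestamp>.json pattern; adjust it for
# a different run.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TheBloke__robin-33B-v2-GPTQ",
    filename="results_2023-08-22T13:23:21.800878.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# The per-task metrics mirror the snippet above; depending on the harness
# version they may sit under a top-level "results" key.
metrics = results.get("results", results)
print(metrics["all"])
```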
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_bigcode-data__slimpajama-1.3b | 2023-08-27T12:45:11.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of bigcode-data/slimpajama-1.3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bigcode-data/slimpajama-1.3b](https://huggingface.co/bigcode-data/slimpajama-1.3b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigcode-data__slimpajama-1.3b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-27T12:00:35.833850](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode-data__slimpajama-1.3b/blob/main/results_2023-08-27T12%3A00%3A35.833850.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26777696601577794,\n\
\ \"acc_stderr\": 0.03181071981671739,\n \"acc_norm\": 0.2705315418772776,\n\
\ \"acc_norm_stderr\": 0.0318183562058576,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041834,\n \"mc2\": 0.3709121363542238,\n\
\ \"mc2_stderr\": 0.013878840992863677\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2781569965870307,\n \"acc_stderr\": 0.013094469919538805,\n\
\ \"acc_norm\": 0.3097269624573379,\n \"acc_norm_stderr\": 0.013512058415238361\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.42202748456482775,\n\
\ \"acc_stderr\": 0.0049287351036358335,\n \"acc_norm\": 0.5529774945230034,\n\
\ \"acc_norm_stderr\": 0.004961693567208826\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.033176727875331574,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.033176727875331574\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.16,\n\
\ \"acc_stderr\": 0.0368452949177471,\n \"acc_norm\": 0.16,\n \
\ \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.02605529690115292,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.02605529690115292\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2832369942196532,\n\
\ \"acc_stderr\": 0.034355680560478746,\n \"acc_norm\": 0.2832369942196532,\n\
\ \"acc_norm_stderr\": 0.034355680560478746\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.23829787234042554,\n \"acc_stderr\": 0.02785125297388979,\n\
\ \"acc_norm\": 0.23829787234042554,\n \"acc_norm_stderr\": 0.02785125297388979\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2804232804232804,\n \"acc_stderr\": 0.02313528797432563,\n \"\
acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.02313528797432563\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.31290322580645163,\n\
\ \"acc_stderr\": 0.026377567028645854,\n \"acc_norm\": 0.31290322580645163,\n\
\ \"acc_norm_stderr\": 0.026377567028645854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.033464098810559534,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.033464098810559534\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3282828282828283,\n \"acc_stderr\": 0.03345678422756776,\n \"\
acc_norm\": 0.3282828282828283,\n \"acc_norm_stderr\": 0.03345678422756776\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2849740932642487,\n \"acc_stderr\": 0.03257714077709661,\n\
\ \"acc_norm\": 0.2849740932642487,\n \"acc_norm_stderr\": 0.03257714077709661\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.35384615384615387,\n \"acc_stderr\": 0.02424378399406217,\n\
\ \"acc_norm\": 0.35384615384615387,\n \"acc_norm_stderr\": 0.02424378399406217\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31512605042016806,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.31512605042016806,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26788990825688075,\n \"acc_stderr\": 0.018987462257978652,\n \"\
acc_norm\": 0.26788990825688075,\n \"acc_norm_stderr\": 0.018987462257978652\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n\
\ \"acc_stderr\": 0.02998373305591362,\n \"acc_norm\": 0.24019607843137256,\n\
\ \"acc_norm_stderr\": 0.02998373305591362\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n\
\ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.15695067264573992,\n\
\ \"acc_stderr\": 0.0244135871749074,\n \"acc_norm\": 0.15695067264573992,\n\
\ \"acc_norm_stderr\": 0.0244135871749074\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596918,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596918\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.03893542518824848,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.03893542518824848\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.31901840490797545,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.31901840490797545,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.17857142857142858,\n\
\ \"acc_stderr\": 0.036352091215778065,\n \"acc_norm\": 0.17857142857142858,\n\
\ \"acc_norm_stderr\": 0.036352091215778065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.33980582524271846,\n \"acc_stderr\": 0.04689765937278135,\n\
\ \"acc_norm\": 0.33980582524271846,\n \"acc_norm_stderr\": 0.04689765937278135\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20512820512820512,\n\
\ \"acc_stderr\": 0.026453508054040332,\n \"acc_norm\": 0.20512820512820512,\n\
\ \"acc_norm_stderr\": 0.026453508054040332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2413793103448276,\n\
\ \"acc_stderr\": 0.01530238012354209,\n \"acc_norm\": 0.2413793103448276,\n\
\ \"acc_norm_stderr\": 0.01530238012354209\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.20520231213872833,\n \"acc_stderr\": 0.021742519835276277,\n\
\ \"acc_norm\": 0.20520231213872833,\n \"acc_norm_stderr\": 0.021742519835276277\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.024404394928087873,\n\
\ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.024404394928087873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n\
\ \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.2572347266881029,\n\
\ \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967273,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967273\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307854,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307854\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25684485006518903,\n\
\ \"acc_stderr\": 0.01115845585309886,\n \"acc_norm\": 0.25684485006518903,\n\
\ \"acc_norm_stderr\": 0.01115845585309886\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.20915032679738563,\n \"acc_stderr\": 0.016453399332279326,\n \
\ \"acc_norm\": 0.20915032679738563,\n \"acc_norm_stderr\": 0.016453399332279326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.04122066502878284,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.04122066502878284\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.19900497512437812,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.19900497512437812,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.22289156626506024,\n\
\ \"acc_stderr\": 0.032400048255946876,\n \"acc_norm\": 0.22289156626506024,\n\
\ \"acc_norm_stderr\": 0.032400048255946876\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041834,\n \"mc2\": 0.3709121363542238,\n\
\ \"mc2_stderr\": 0.013878840992863677\n }\n}\n```"
repo_url: https://huggingface.co/bigcode-data/slimpajama-1.3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|arc:challenge|25_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hellaswag|10_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-27T12:00:35.833850.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-27T12:00:35.833850.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-27T12:00:35.833850.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-27T12:00:35.833850.parquet'
- config_name: results
data_files:
- split: 2023_08_27T12_00_35.833850
path:
- results_2023-08-27T12:00:35.833850.parquet
- split: latest
path:
- results_2023-08-27T12:00:35.833850.parquet
---
# Dataset Card for Evaluation run of bigcode-data/slimpajama-1.3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bigcode-data/slimpajama-1.3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bigcode-data/slimpajama-1.3b](https://huggingface.co/bigcode-data/slimpajama-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigcode-data__slimpajama-1.3b",
"harness_truthfulqa_mc_0",
split="train")
```
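The same call also works for the aggregated "results" configuration and for the "latest" split of any detail configuration listed above; a minimal sketch (assuming the `datasets` library is installed):
```python
from datasets import load_dataset

# Aggregated metrics for the run (the "results" configuration defined above);
# the "latest" split points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_bigcode-data__slimpajama-1.3b",
    "results",
    split="latest",
)

# Per-sample details for a single task, e.g. the 25-shot ARC challenge configuration.
arc_details = load_dataset(
    "open-llm-leaderboard/details_bigcode-data__slimpajama-1.3b",
    "harness_arc_challenge_25",
    split="latest",
)
```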
## Latest results
These are the [latest results from run 2023-08-27T12:00:35.833850](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode-data__slimpajama-1.3b/blob/main/results_2023-08-27T12%3A00%3A35.833850.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26777696601577794,
"acc_stderr": 0.03181071981671739,
"acc_norm": 0.2705315418772776,
"acc_norm_stderr": 0.0318183562058576,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041834,
"mc2": 0.3709121363542238,
"mc2_stderr": 0.013878840992863677
},
"harness|arc:challenge|25": {
"acc": 0.2781569965870307,
"acc_stderr": 0.013094469919538805,
"acc_norm": 0.3097269624573379,
"acc_norm_stderr": 0.013512058415238361
},
"harness|hellaswag|10": {
"acc": 0.42202748456482775,
"acc_stderr": 0.0049287351036358335,
"acc_norm": 0.5529774945230034,
"acc_norm_stderr": 0.004961693567208826
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.033176727875331574,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.033176727875331574
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.16,
"acc_stderr": 0.0368452949177471,
"acc_norm": 0.16,
"acc_norm_stderr": 0.0368452949177471
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.034355680560478746,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.034355680560478746
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23829787234042554,
"acc_stderr": 0.02785125297388979,
"acc_norm": 0.23829787234042554,
"acc_norm_stderr": 0.02785125297388979
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.02313528797432563,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.02313528797432563
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.31290322580645163,
"acc_stderr": 0.026377567028645854,
"acc_norm": 0.31290322580645163,
"acc_norm_stderr": 0.026377567028645854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.033464098810559534,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.033464098810559534
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3282828282828283,
"acc_stderr": 0.03345678422756776,
"acc_norm": 0.3282828282828283,
"acc_norm_stderr": 0.03345678422756776
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2849740932642487,
"acc_stderr": 0.03257714077709661,
"acc_norm": 0.2849740932642487,
"acc_norm_stderr": 0.03257714077709661
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.35384615384615387,
"acc_stderr": 0.02424378399406217,
"acc_norm": 0.35384615384615387,
"acc_norm_stderr": 0.02424378399406217
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31512605042016806,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.31512605042016806,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26788990825688075,
"acc_stderr": 0.018987462257978652,
"acc_norm": 0.26788990825688075,
"acc_norm_stderr": 0.018987462257978652
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.15695067264573992,
"acc_stderr": 0.0244135871749074,
"acc_norm": 0.15695067264573992,
"acc_norm_stderr": 0.0244135871749074
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596918,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596918
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.03893542518824848,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.03893542518824848
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.31901840490797545,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.31901840490797545,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.17857142857142858,
"acc_stderr": 0.036352091215778065,
"acc_norm": 0.17857142857142858,
"acc_norm_stderr": 0.036352091215778065
},
"harness|hendrycksTest-management|5": {
"acc": 0.33980582524271846,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.33980582524271846,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.026453508054040332,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.026453508054040332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.01530238012354209,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.01530238012354209
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.20520231213872833,
"acc_stderr": 0.021742519835276277,
"acc_norm": 0.20520231213872833,
"acc_norm_stderr": 0.021742519835276277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.024404394928087873,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.024404394928087873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967273,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967273
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307854,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307854
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25684485006518903,
"acc_stderr": 0.01115845585309886,
"acc_norm": 0.25684485006518903,
"acc_norm_stderr": 0.01115845585309886
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.20915032679738563,
"acc_stderr": 0.016453399332279326,
"acc_norm": 0.20915032679738563,
"acc_norm_stderr": 0.016453399332279326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.04122066502878284,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.04122066502878284
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.19900497512437812,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.19900497512437812,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.22289156626506024,
"acc_stderr": 0.032400048255946876,
"acc_norm": 0.22289156626506024,
"acc_norm_stderr": 0.032400048255946876
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041834,
"mc2": 0.3709121363542238,
"mc2_stderr": 0.013878840992863677
}
}
```
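Assuming the dictionary above has been parsed into a Python object (the variable name `latest_results` below is purely illustrative), individual metrics can be read by task key; a small sketch:
```python
# `latest_results` stands in for the dictionary shown above
# (only a few entries are reproduced here for brevity).
latest_results = {
    "all": {"acc": 0.26777696601577794, "acc_norm": 0.2705315418772776},
    "harness|hellaswag|10": {"acc": 0.42202748456482775, "acc_norm": 0.5529774945230034},
    "harness|truthfulqa:mc|0": {"mc1": 0.23745410036719705, "mc2": 0.3709121363542238},
}

# Task-level entries are keyed as "harness|<task>|<num_fewshot>".
overall_acc = latest_results["all"]["acc"]
hellaswag_acc_norm = latest_results["harness|hellaswag|10"]["acc_norm"]
truthfulqa_mc2 = latest_results["harness|truthfulqa:mc|0"]["mc2"]

print(f"average acc: {overall_acc:.4f}")
print(f"hellaswag acc_norm: {hellaswag_acc_norm:.4f}")
print(f"truthfulqa mc2: {truthfulqa_mc2:.4f}")
```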
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_conceptofmind__LLongMA-2-7b-16k | 2023-08-27T12:45:13.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of conceptofmind/LLongMA-2-7b-16k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [conceptofmind/LLongMA-2-7b-16k](https://huggingface.co/conceptofmind/LLongMA-2-7b-16k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_conceptofmind__LLongMA-2-7b-16k\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T18:10:15.438623](https://huggingface.co/datasets/open-llm-leaderboard/details_conceptofmind__LLongMA-2-7b-16k/blob/main/results_2023-08-22T18%3A10%3A15.438623.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38930534904833497,\n\
\ \"acc_stderr\": 0.03491718638238362,\n \"acc_norm\": 0.39334911896029395,\n\
\ \"acc_norm_stderr\": 0.03490530736130865,\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834555,\n \"mc2\": 0.3905977414272289,\n\
\ \"mc2_stderr\": 0.01382437830936997\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4803754266211604,\n \"acc_stderr\": 0.014600132075947085,\n\
\ \"acc_norm\": 0.5221843003412969,\n \"acc_norm_stderr\": 0.014597001927076143\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5653256323441546,\n\
\ \"acc_stderr\": 0.004947010937455345,\n \"acc_norm\": 0.7620991834295957,\n\
\ \"acc_norm_stderr\": 0.004249278842903415\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.37735849056603776,\n \"acc_stderr\": 0.029832808114796,\n\
\ \"acc_norm\": 0.37735849056603776,\n \"acc_norm_stderr\": 0.029832808114796\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\
\ \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n\
\ \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.0314108219759624,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.0314108219759624\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.040434618619167466,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.040434618619167466\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.023068188848261114,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.023068188848261114\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\
\ \"acc_stderr\": 0.0345507101910215,\n \"acc_norm\": 0.18253968253968253,\n\
\ \"acc_norm_stderr\": 0.0345507101910215\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.36451612903225805,\n\
\ \"acc_stderr\": 0.02737987122994324,\n \"acc_norm\": 0.36451612903225805,\n\
\ \"acc_norm_stderr\": 0.02737987122994324\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678242,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678242\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.47878787878787876,\n \"acc_stderr\": 0.03900828913737302,\n\
\ \"acc_norm\": 0.47878787878787876,\n \"acc_norm_stderr\": 0.03900828913737302\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.03358618145732523,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03358618145732523\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5544041450777202,\n \"acc_stderr\": 0.03587014986075659,\n\
\ \"acc_norm\": 0.5544041450777202,\n \"acc_norm_stderr\": 0.03587014986075659\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31794871794871793,\n \"acc_stderr\": 0.023610884308927865,\n\
\ \"acc_norm\": 0.31794871794871793,\n \"acc_norm_stderr\": 0.023610884308927865\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184407,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184407\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.029953823891887037,\n\
\ \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.029953823891887037\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.43853211009174314,\n \"acc_stderr\": 0.021274713073954562,\n \"\
acc_norm\": 0.43853211009174314,\n \"acc_norm_stderr\": 0.021274713073954562\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3194444444444444,\n \"acc_stderr\": 0.0317987634217685,\n \"acc_norm\"\
: 0.3194444444444444,\n \"acc_norm_stderr\": 0.0317987634217685\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.03460228327239171,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.03460228327239171\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.48945147679324896,\n \"acc_stderr\": 0.032539983791662855,\n\
\ \"acc_norm\": 0.48945147679324896,\n \"acc_norm_stderr\": 0.032539983791662855\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.47533632286995514,\n\
\ \"acc_stderr\": 0.033516951676526276,\n \"acc_norm\": 0.47533632286995514,\n\
\ \"acc_norm_stderr\": 0.033516951676526276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4122137404580153,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.4122137404580153,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5785123966942148,\n \"acc_stderr\": 0.04507732278775087,\n \"\
acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.04507732278775087\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3611111111111111,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.3611111111111111,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.38650306748466257,\n \"acc_stderr\": 0.038258255488486076,\n\
\ \"acc_norm\": 0.38650306748466257,\n \"acc_norm_stderr\": 0.038258255488486076\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4174757281553398,\n \"acc_stderr\": 0.04882840548212238,\n\
\ \"acc_norm\": 0.4174757281553398,\n \"acc_norm_stderr\": 0.04882840548212238\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5042735042735043,\n\
\ \"acc_stderr\": 0.032754892643821316,\n \"acc_norm\": 0.5042735042735043,\n\
\ \"acc_norm_stderr\": 0.032754892643821316\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5031928480204342,\n\
\ \"acc_stderr\": 0.017879598945933082,\n \"acc_norm\": 0.5031928480204342,\n\
\ \"acc_norm_stderr\": 0.017879598945933082\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.44508670520231214,\n \"acc_stderr\": 0.02675625512966377,\n\
\ \"acc_norm\": 0.44508670520231214,\n \"acc_norm_stderr\": 0.02675625512966377\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.014465893829859926,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.014465893829859926\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.434640522875817,\n \"acc_stderr\": 0.028384256704883037,\n\
\ \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.028384256704883037\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.43086816720257237,\n\
\ \"acc_stderr\": 0.028125340983972714,\n \"acc_norm\": 0.43086816720257237,\n\
\ \"acc_norm_stderr\": 0.028125340983972714\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.42901234567901236,\n \"acc_stderr\": 0.027538925613470867,\n\
\ \"acc_norm\": 0.42901234567901236,\n \"acc_norm_stderr\": 0.027538925613470867\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02812163604063989,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02812163604063989\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3070404172099087,\n\
\ \"acc_stderr\": 0.011780959114513764,\n \"acc_norm\": 0.3070404172099087,\n\
\ \"acc_norm_stderr\": 0.011780959114513764\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.40441176470588236,\n \"acc_stderr\": 0.02981263070156974,\n\
\ \"acc_norm\": 0.40441176470588236,\n \"acc_norm_stderr\": 0.02981263070156974\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.40522875816993464,\n \"acc_stderr\": 0.019861155193829173,\n \
\ \"acc_norm\": 0.40522875816993464,\n \"acc_norm_stderr\": 0.019861155193829173\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.031680911612338825,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.031680911612338825\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.48756218905472637,\n\
\ \"acc_stderr\": 0.0353443984853958,\n \"acc_norm\": 0.48756218905472637,\n\
\ \"acc_norm_stderr\": 0.0353443984853958\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.52046783625731,\n \"acc_stderr\": 0.038316105328219316,\n\
\ \"acc_norm\": 0.52046783625731,\n \"acc_norm_stderr\": 0.038316105328219316\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834555,\n \"mc2\": 0.3905977414272289,\n\
\ \"mc2_stderr\": 0.01382437830936997\n }\n}\n```"
repo_url: https://huggingface.co/conceptofmind/LLongMA-2-7b-16k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|arc:challenge|25_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hellaswag|10_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T18:10:15.438623.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T18:10:15.438623.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T18_10_15.438623
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T18:10:15.438623.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T18:10:15.438623.parquet'
---
# Dataset Card for Evaluation run of conceptofmind/LLongMA-2-7b-16k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/conceptofmind/LLongMA-2-7b-16k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [conceptofmind/LLongMA-2-7b-16k](https://huggingface.co/conceptofmind/LLongMA-2-7b-16k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_conceptofmind__LLongMA-2-7b-16k",
"harness_truthfulqa_mc_0",
split="train")
```
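The same pattern works for any of the per-task configurations listed in the metadata above. As a minimal sketch (assuming you want the most recent run, exposed here as the `latest` split, for one of the MMLU sub-tasks):
```python
from datasets import load_dataset

# Load the per-sample details of a single MMLU (hendrycksTest) sub-task.
# The "latest" split is an alias for the most recent timestamped run.
world_religions = load_dataset(
    "open-llm-leaderboard/details_conceptofmind__LLongMA-2-7b-16k",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(world_religions)
```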
## Latest results
These are the [latest results from run 2023-08-22T18:10:15.438623](https://huggingface.co/datasets/open-llm-leaderboard/details_conceptofmind__LLongMA-2-7b-16k/blob/main/results_2023-08-22T18%3A10%3A15.438623.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.38930534904833497,
"acc_stderr": 0.03491718638238362,
"acc_norm": 0.39334911896029395,
"acc_norm_stderr": 0.03490530736130865,
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834555,
"mc2": 0.3905977414272289,
"mc2_stderr": 0.01382437830936997
},
"harness|arc:challenge|25": {
"acc": 0.4803754266211604,
"acc_stderr": 0.014600132075947085,
"acc_norm": 0.5221843003412969,
"acc_norm_stderr": 0.014597001927076143
},
"harness|hellaswag|10": {
"acc": 0.5653256323441546,
"acc_stderr": 0.004947010937455345,
"acc_norm": 0.7620991834295957,
"acc_norm_stderr": 0.004249278842903415
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.37735849056603776,
"acc_stderr": 0.029832808114796,
"acc_norm": 0.37735849056603776,
"acc_norm_stderr": 0.029832808114796
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.375,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.0314108219759624,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.0314108219759624
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.023068188848261114,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.023068188848261114
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.0345507101910215,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.0345507101910215
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36451612903225805,
"acc_stderr": 0.02737987122994324,
"acc_norm": 0.36451612903225805,
"acc_norm_stderr": 0.02737987122994324
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03144712581678242,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.03144712581678242
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.47878787878787876,
"acc_stderr": 0.03900828913737302,
"acc_norm": 0.47878787878787876,
"acc_norm_stderr": 0.03900828913737302
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03358618145732523,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03358618145732523
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5544041450777202,
"acc_stderr": 0.03587014986075659,
"acc_norm": 0.5544041450777202,
"acc_norm_stderr": 0.03587014986075659
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31794871794871793,
"acc_stderr": 0.023610884308927865,
"acc_norm": 0.31794871794871793,
"acc_norm_stderr": 0.023610884308927865
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184407,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184407
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3067226890756303,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.3067226890756303,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.43853211009174314,
"acc_stderr": 0.021274713073954562,
"acc_norm": 0.43853211009174314,
"acc_norm_stderr": 0.021274713073954562
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.0317987634217685,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.0317987634217685
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03460228327239171,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03460228327239171
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.48945147679324896,
"acc_stderr": 0.032539983791662855,
"acc_norm": 0.48945147679324896,
"acc_norm_stderr": 0.032539983791662855
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.47533632286995514,
"acc_stderr": 0.033516951676526276,
"acc_norm": 0.47533632286995514,
"acc_norm_stderr": 0.033516951676526276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4122137404580153,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.4122137404580153,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.04507732278775087,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.04507732278775087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.38650306748466257,
"acc_stderr": 0.038258255488486076,
"acc_norm": 0.38650306748466257,
"acc_norm_stderr": 0.038258255488486076
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.4174757281553398,
"acc_stderr": 0.04882840548212238,
"acc_norm": 0.4174757281553398,
"acc_norm_stderr": 0.04882840548212238
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5042735042735043,
"acc_stderr": 0.032754892643821316,
"acc_norm": 0.5042735042735043,
"acc_norm_stderr": 0.032754892643821316
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5031928480204342,
"acc_stderr": 0.017879598945933082,
"acc_norm": 0.5031928480204342,
"acc_norm_stderr": 0.017879598945933082
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.02675625512966377,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.02675625512966377
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859926,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859926
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.028384256704883037,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.028384256704883037
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.43086816720257237,
"acc_stderr": 0.028125340983972714,
"acc_norm": 0.43086816720257237,
"acc_norm_stderr": 0.028125340983972714
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.42901234567901236,
"acc_stderr": 0.027538925613470867,
"acc_norm": 0.42901234567901236,
"acc_norm_stderr": 0.027538925613470867
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02812163604063989,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02812163604063989
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3070404172099087,
"acc_stderr": 0.011780959114513764,
"acc_norm": 0.3070404172099087,
"acc_norm_stderr": 0.011780959114513764
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40441176470588236,
"acc_stderr": 0.02981263070156974,
"acc_norm": 0.40441176470588236,
"acc_norm_stderr": 0.02981263070156974
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.40522875816993464,
"acc_stderr": 0.019861155193829173,
"acc_norm": 0.40522875816993464,
"acc_norm_stderr": 0.019861155193829173
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.4,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.48756218905472637,
"acc_stderr": 0.0353443984853958,
"acc_norm": 0.48756218905472637,
"acc_norm_stderr": 0.0353443984853958
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.52046783625731,
"acc_stderr": 0.038316105328219316,
"acc_norm": 0.52046783625731,
"acc_norm_stderr": 0.038316105328219316
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834555,
"mc2": 0.3905977414272289,
"mc2_stderr": 0.01382437830936997
}
}
```
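These per-task entries can also be inspected programmatically. The snippet below is a rough sketch, assuming the JSON block above has been saved locally to a hypothetical `results.json` file; it averages the 5-shot accuracy over the MMLU (hendrycksTest) sub-tasks:
```python
import json

# Hypothetical local copy of the results block shown above.
with open("results.json") as f:
    results = json.load(f)

# Collect the 5-shot accuracy of every hendrycksTest (MMLU) sub-task.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"MMLU sub-tasks: {len(mmlu_accs)}, mean acc: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```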
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-chat | 2023-09-22T13:35:54.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of bofenghuang/vigogne-2-7b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bofenghuang/vigogne-2-7b-chat](https://huggingface.co/bofenghuang/vigogne-2-7b-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T13:35:42.061271](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-chat/blob/main/results_2023-09-22T13-35-42.061271.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2779991610738255,\n\
\ \"em_stderr\": 0.0045880722162316605,\n \"f1\": 0.32825188758389273,\n\
\ \"f1_stderr\": 0.004516960799751206,\n \"acc\": 0.41080456661279235,\n\
\ \"acc_stderr\": 0.00980948368433141\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2779991610738255,\n \"em_stderr\": 0.0045880722162316605,\n\
\ \"f1\": 0.32825188758389273,\n \"f1_stderr\": 0.004516960799751206\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07733131159969674,\n \
\ \"acc_stderr\": 0.007357713523222347\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440473\n\
\ }\n}\n```"
repo_url: https://huggingface.co/bofenghuang/vigogne-2-7b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T13_35_42.061271
path:
- '**/details_harness|drop|3_2023-09-22T13-35-42.061271.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T13-35-42.061271.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T13_35_42.061271
path:
- '**/details_harness|gsm8k|5_2023-09-22T13-35-42.061271.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T13-35-42.061271.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T13_35_42.061271
path:
- '**/details_harness|winogrande|5_2023-09-22T13-35-42.061271.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T13-35-42.061271.parquet'
- config_name: results
data_files:
- split: 2023_09_22T13_35_42.061271
path:
- results_2023-09-22T13-35-42.061271.parquet
- split: latest
path:
- results_2023-09-22T13-35-42.061271.parquet
---
# Dataset Card for Evaluation run of bofenghuang/vigogne-2-7b-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bofenghuang/vigogne-2-7b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bofenghuang/vigogne-2-7b-chat](https://huggingface.co/bofenghuang/vigogne-2-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-chat",
"harness_winogrande_5",
split="train")
```
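The `harness_winogrande_5` name above is one of the configurations defined in this card's YAML; any other configuration can be loaded the same way. As a minimal sketch (using only the configuration and split names listed above), you can also request the `latest` split to stay pinned to the most recent run instead of a timestamped one:
```python
from datasets import load_dataset

# Load the most recent evaluation details for one task configuration.
# The "latest" split always points at the newest timestamped run of that config.
details = load_dataset(
    "open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-chat",
    "harness_winogrande_5",
    split="latest",
)
print(details)
```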
## Latest results
These are the [latest results from run 2023-09-22T13:35:42.061271](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-chat/blob/main/results_2023-09-22T13-35-42.061271.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.2779991610738255,
"em_stderr": 0.0045880722162316605,
"f1": 0.32825188758389273,
"f1_stderr": 0.004516960799751206,
"acc": 0.41080456661279235,
"acc_stderr": 0.00980948368433141
},
"harness|drop|3": {
"em": 0.2779991610738255,
"em_stderr": 0.0045880722162316605,
"f1": 0.32825188758389273,
"f1_stderr": 0.004516960799751206
},
"harness|gsm8k|5": {
"acc": 0.07733131159969674,
"acc_stderr": 0.007357713523222347
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440473
}
}
```
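To work with these aggregated numbers programmatically instead of reading the JSON above, one option is to load the "results" configuration defined in this card, whose "latest" split points at the same aggregated results file. This is a minimal sketch; the exact column layout of the parquet file is not documented here, so the final print is only meant for inspection:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run.
results = load_dataset(
    "open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-chat",
    "results",
    split="latest",
)
print(results)      # dataset schema
print(results[0])   # first row: aggregated metrics of the latest run
```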
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_dvruette__oasst-pythia-12b-6000-steps | 2023-08-27T12:45:17.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of dvruette/oasst-pythia-12b-6000-steps
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dvruette/oasst-pythia-12b-6000-steps](https://huggingface.co/dvruette/oasst-pythia-12b-6000-steps)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__oasst-pythia-12b-6000-steps\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T17:23:24.296836](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-pythia-12b-6000-steps/blob/main/results_2023-08-22T17%3A23%3A24.296836.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2668252745777657,\n\
\ \"acc_stderr\": 0.03187566965942044,\n \"acc_norm\": 0.27041889312856526,\n\
\ \"acc_norm_stderr\": 0.03187152132999994,\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826842,\n \"mc2\": 0.3984757796276021,\n\
\ \"mc2_stderr\": 0.015360523593829214\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.41467576791808874,\n \"acc_stderr\": 0.01439707056440917,\n\
\ \"acc_norm\": 0.4539249146757679,\n \"acc_norm_stderr\": 0.014549221105171864\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5239992033459471,\n\
\ \"acc_stderr\": 0.004984030250507296,\n \"acc_norm\": 0.6967735510854411,\n\
\ \"acc_norm_stderr\": 0.004587128273935071\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.03761070869867479,\n\
\ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.03761070869867479\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.027724236492700907,\n\
\ \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.027724236492700907\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2847222222222222,\n\
\ \"acc_stderr\": 0.03773809990686935,\n \"acc_norm\": 0.2847222222222222,\n\
\ \"acc_norm_stderr\": 0.03773809990686935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.19574468085106383,\n \"acc_stderr\": 0.025937853139977148,\n\
\ \"acc_norm\": 0.19574468085106383,\n \"acc_norm_stderr\": 0.025937853139977148\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438014,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438014\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633345,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633345\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924316,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.24516129032258063,\n \"acc_stderr\": 0.02447224384089553,\n \"\
acc_norm\": 0.24516129032258063,\n \"acc_norm_stderr\": 0.02447224384089553\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.0317852971064275,\n \"acc_norm\"\
: 0.2857142857142857,\n \"acc_norm_stderr\": 0.0317852971064275\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3282828282828283,\n \"acc_stderr\": 0.03345678422756776,\n \"\
acc_norm\": 0.3282828282828283,\n \"acc_norm_stderr\": 0.03345678422756776\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27979274611398963,\n \"acc_stderr\": 0.03239637046735702,\n\
\ \"acc_norm\": 0.27979274611398963,\n \"acc_norm_stderr\": 0.03239637046735702\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.022421273612923707,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.022421273612923707\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02755361446786379,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02755361446786379\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.25137614678899084,\n\
\ \"acc_stderr\": 0.01859920636028741,\n \"acc_norm\": 0.25137614678899084,\n\
\ \"acc_norm_stderr\": 0.01859920636028741\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536027,\n\
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536027\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28921568627450983,\n \"acc_stderr\": 0.031822318676475544,\n \"\
acc_norm\": 0.28921568627450983,\n \"acc_norm_stderr\": 0.031822318676475544\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.20253164556962025,\n \"acc_stderr\": 0.026160568246601464,\n \
\ \"acc_norm\": 0.20253164556962025,\n \"acc_norm_stderr\": 0.026160568246601464\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.19282511210762332,\n\
\ \"acc_stderr\": 0.02647824096048936,\n \"acc_norm\": 0.19282511210762332,\n\
\ \"acc_norm_stderr\": 0.02647824096048936\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.03547771004159462,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.03547771004159462\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\"\
: 0.371900826446281,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.03512385283705051,\n\
\ \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.03512385283705051\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.15178571428571427,\n\
\ \"acc_stderr\": 0.03405702838185693,\n \"acc_norm\": 0.15178571428571427,\n\
\ \"acc_norm_stderr\": 0.03405702838185693\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.02860595370200425,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.02860595370200425\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2503192848020434,\n\
\ \"acc_stderr\": 0.015491088951494569,\n \"acc_norm\": 0.2503192848020434,\n\
\ \"acc_norm_stderr\": 0.015491088951494569\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961464,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961464\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.025261691219729484,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.025261691219729484\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.29260450160771706,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902,\n \"acc_norm\"\
: 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902\n },\n\
\ \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2692307692307692,\n\
\ \"acc_stderr\": 0.011328734403140315,\n \"acc_norm\": 0.2692307692307692,\n\
\ \"acc_norm_stderr\": 0.011328734403140315\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.31985294117647056,\n \"acc_stderr\": 0.028332959514031218,\n\
\ \"acc_norm\": 0.31985294117647056,\n \"acc_norm_stderr\": 0.028332959514031218\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2581699346405229,\n \"acc_stderr\": 0.017704531653250075,\n \
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.017704531653250075\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24489795918367346,\n \"acc_stderr\": 0.027529637440174917,\n\
\ \"acc_norm\": 0.24489795918367346,\n \"acc_norm_stderr\": 0.027529637440174917\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064536,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.0340105262010409,\n\
\ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.0340105262010409\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826842,\n \"mc2\": 0.3984757796276021,\n\
\ \"mc2_stderr\": 0.015360523593829214\n }\n}\n```"
repo_url: https://huggingface.co/dvruette/oasst-pythia-12b-6000-steps
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|arc:challenge|25_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hellaswag|10_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:23:24.296836.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:23:24.296836.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T17_23_24.296836
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T17:23:24.296836.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T17:23:24.296836.parquet'
---
# Dataset Card for Evaluation run of dvruette/oasst-pythia-12b-6000-steps
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dvruette/oasst-pythia-12b-6000-steps
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dvruette/oasst-pythia-12b-6000-steps](https://huggingface.co/dvruette/oasst-pythia-12b-6000-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dvruette__oasst-pythia-12b-6000-steps",
"harness_truthfulqa_mc_0",
split="train")
```
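The aggregated metrics live in the additional "results" configuration described above. Here is a minimal sketch for discovering the available configurations and loading those aggregated results (assuming the configuration is exposed under the name "results" and uses the same "latest" split naming as the per-task configurations):
```python
from datasets import load_dataset, get_dataset_config_names

repo = "open-llm-leaderboard/details_dvruette__oasst-pythia-12b-6000-steps"

# Enumerate the per-task configurations (plus the aggregated "results" one).
configs = get_dataset_config_names(repo)
print(configs)

# Load the aggregated results from the latest evaluation run
# (assumes a "results" configuration with a "latest" split, as described above).
results = load_dataset(repo, "results", split="latest")
print(results)
```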
## Latest results
These are the [latest results from run 2023-08-22T17:23:24.296836](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-pythia-12b-6000-steps/blob/main/results_2023-08-22T17%3A23%3A24.296836.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2668252745777657,
"acc_stderr": 0.03187566965942044,
"acc_norm": 0.27041889312856526,
"acc_norm_stderr": 0.03187152132999994,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826842,
"mc2": 0.3984757796276021,
"mc2_stderr": 0.015360523593829214
},
"harness|arc:challenge|25": {
"acc": 0.41467576791808874,
"acc_stderr": 0.01439707056440917,
"acc_norm": 0.4539249146757679,
"acc_norm_stderr": 0.014549221105171864
},
"harness|hellaswag|10": {
"acc": 0.5239992033459471,
"acc_stderr": 0.004984030250507296,
"acc_norm": 0.6967735510854411,
"acc_norm_stderr": 0.004587128273935071
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.027724236492700907,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.027724236492700907
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2847222222222222,
"acc_stderr": 0.03773809990686935,
"acc_norm": 0.2847222222222222,
"acc_norm_stderr": 0.03773809990686935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.19574468085106383,
"acc_stderr": 0.025937853139977148,
"acc_norm": 0.19574468085106383,
"acc_norm_stderr": 0.025937853139977148
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438014,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438014
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633345,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633345
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924316,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.02447224384089553,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.02447224384089553
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0317852971064275,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0317852971064275
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3282828282828283,
"acc_stderr": 0.03345678422756776,
"acc_norm": 0.3282828282828283,
"acc_norm_stderr": 0.03345678422756776
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27979274611398963,
"acc_stderr": 0.03239637046735702,
"acc_norm": 0.27979274611398963,
"acc_norm_stderr": 0.03239637046735702
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.022421273612923707,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.022421273612923707
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02755361446786379,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02755361446786379
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25137614678899084,
"acc_stderr": 0.01859920636028741,
"acc_norm": 0.25137614678899084,
"acc_norm_stderr": 0.01859920636028741
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536027,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536027
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28921568627450983,
"acc_stderr": 0.031822318676475544,
"acc_norm": 0.28921568627450983,
"acc_norm_stderr": 0.031822318676475544
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20253164556962025,
"acc_stderr": 0.026160568246601464,
"acc_norm": 0.20253164556962025,
"acc_norm_stderr": 0.026160568246601464
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.19282511210762332,
"acc_stderr": 0.02647824096048936,
"acc_norm": 0.19282511210762332,
"acc_norm_stderr": 0.02647824096048936
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.03547771004159462,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.03547771004159462
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.27607361963190186,
"acc_stderr": 0.03512385283705051,
"acc_norm": 0.27607361963190186,
"acc_norm_stderr": 0.03512385283705051
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.15178571428571427,
"acc_stderr": 0.03405702838185693,
"acc_norm": 0.15178571428571427,
"acc_norm_stderr": 0.03405702838185693
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02860595370200425,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02860595370200425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2503192848020434,
"acc_stderr": 0.015491088951494569,
"acc_norm": 0.2503192848020434,
"acc_norm_stderr": 0.015491088951494569
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961464,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.25,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.011328734403140315,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.011328734403140315
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.31985294117647056,
"acc_stderr": 0.028332959514031218,
"acc_norm": 0.31985294117647056,
"acc_norm_stderr": 0.028332959514031218
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.017704531653250075,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.017704531653250075
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24489795918367346,
"acc_stderr": 0.027529637440174917,
"acc_norm": 0.24489795918367346,
"acc_norm_stderr": 0.027529637440174917
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064536,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826842,
"mc2": 0.3984757796276021,
"mc2_stderr": 0.015360523593829214
}
}
```
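To inspect the per-example predictions behind any of the task-level numbers above, the same loading pattern works with one of the task configurations listed in this card. A minimal sketch (shown here with the high-school-biology configuration; converting to pandas is optional and assumes the `pandas` package is installed):
```python
from datasets import load_dataset

# Per-example details for a single MMLU task, taken from the latest run.
details = load_dataset(
    "open-llm-leaderboard/details_dvruette__oasst-pythia-12b-6000-steps",
    "harness_hendrycksTest_high_school_biology_5",
    split="latest",
)

# Convert to a DataFrame for ad-hoc inspection; the columns are whatever
# fields the evaluation harness logged for each example.
df = details.to_pandas()
print(df.shape)
print(df.columns.tolist())
```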
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v1 | 2023-08-27T12:45:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/llama-2-13B-ensemble-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-13B-ensemble-v1](https://huggingface.co/yeontaek/llama-2-13B-ensemble-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-24T06:11:39.305449](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v1/blob/main/results_2023-08-24T06%3A11%3A39.305449.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5771705795394726,\n\
\ \"acc_stderr\": 0.03407019951989153,\n \"acc_norm\": 0.5809029426034396,\n\
\ \"acc_norm_stderr\": 0.0340502352653076,\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5015608436175973,\n\
\ \"mc2_stderr\": 0.015270015874980385\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5998293515358362,\n \"acc_stderr\": 0.014317197787809169,\n\
\ \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.0141633668961926\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6264688309101772,\n\
\ \"acc_stderr\": 0.004827526584889677,\n \"acc_norm\": 0.8236407090221072,\n\
\ \"acc_norm_stderr\": 0.0038034664560544717\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955784,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955784\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.040329990539607175,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.040329990539607175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.03794012674697031,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.03794012674697031\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.02422996529842507,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.02422996529842507\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6580645161290323,\n \"acc_stderr\": 0.026985289576552746,\n \"\
acc_norm\": 0.6580645161290323,\n \"acc_norm_stderr\": 0.026985289576552746\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438803,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438803\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.02749350424454806,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.02749350424454806\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.02510682066053975,\n \
\ \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.02510682066053975\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587192,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587192\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.03169380235712996,\n \
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.03169380235712996\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7651376146788991,\n \"acc_stderr\": 0.018175110510343564,\n \"\
acc_norm\": 0.7651376146788991,\n \"acc_norm_stderr\": 0.018175110510343564\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.024662496845209807,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.024662496845209807\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n\
\ \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n\
\ \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n\
\ \"acc_stderr\": 0.01656897123354861,\n \"acc_norm\": 0.4324022346368715,\n\
\ \"acc_norm_stderr\": 0.01656897123354861\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.02763417668960266,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.02763417668960266\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380154,\n\
\ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380154\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4178617992177314,\n\
\ \"acc_stderr\": 0.012596744108998565,\n \"acc_norm\": 0.4178617992177314,\n\
\ \"acc_norm_stderr\": 0.012596744108998565\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.030306257722468304,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.030306257722468304\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5849673202614379,\n \"acc_stderr\": 0.01993362777685742,\n \
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.01993362777685742\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.031524391865554,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.031524391865554\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5015608436175973,\n\
\ \"mc2_stderr\": 0.015270015874980385\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-13B-ensemble-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|arc:challenge|25_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hellaswag|10_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T06:11:39.305449.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T06:11:39.305449.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T06_11_39.305449
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T06:11:39.305449.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T06:11:39.305449.parquet'
---
# Dataset Card for Evaluation run of yeontaek/llama-2-13B-ensemble-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-13B-ensemble-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-13B-ensemble-v1](https://huggingface.co/yeontaek/llama-2-13B-ensemble-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v1",
"harness_truthfulqa_mc_0",
split="train")
```
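The same pattern works for any of the per-task configurations listed in the YAML header above. As an illustrative variant (a sketch that simply reuses the configuration and split names declared in this card, not an officially documented recipe), you can also pin the `latest` split of a single sub-task:
```python
from datasets import load_dataset

# Load the per-example details for one MMLU sub-task; the "latest" split
# mirrors the most recent evaluation run for this model.
details = load_dataset(
    "open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v1",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)

# Inspect the first row of the per-example details.
print(details[0])
```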
## Latest results
These are the [latest results from run 2023-08-24T06:11:39.305449](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v1/blob/main/results_2023-08-24T06%3A11%3A39.305449.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5771705795394726,
"acc_stderr": 0.03407019951989153,
"acc_norm": 0.5809029426034396,
"acc_norm_stderr": 0.0340502352653076,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677157,
"mc2": 0.5015608436175973,
"mc2_stderr": 0.015270015874980385
},
"harness|arc:challenge|25": {
"acc": 0.5998293515358362,
"acc_stderr": 0.014317197787809169,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.0141633668961926
},
"harness|hellaswag|10": {
"acc": 0.6264688309101772,
"acc_stderr": 0.004827526584889677,
"acc_norm": 0.8236407090221072,
"acc_norm_stderr": 0.0038034664560544717
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955784,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.040329990539607175,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.040329990539607175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.03794012674697031,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.03794012674697031
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.02422996529842507,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.02422996529842507
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.042857142857142816,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.042857142857142816
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.026985289576552746,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.026985289576552746
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438803,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438803
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.02749350424454806,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.02749350424454806
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.02510682066053975,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.02510682066053975
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587192,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587192
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.03169380235712996,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.03169380235712996
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7651376146788991,
"acc_stderr": 0.018175110510343564,
"acc_norm": 0.7651376146788991,
"acc_norm_stderr": 0.018175110510343564
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.024662496845209807,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.024662496845209807
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.01656897123354861,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.01656897123354861
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.02763417668960266,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.02763417668960266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380154,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4178617992177314,
"acc_stderr": 0.012596744108998565,
"acc_norm": 0.4178617992177314,
"acc_norm_stderr": 0.012596744108998565
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.030306257722468304,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.030306257722468304
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.01993362777685742,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.01993362777685742
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.031524391865554,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.031524391865554
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677157,
"mc2": 0.5015608436175973,
"mc2_stderr": 0.015270015874980385
}
}
```
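If you only need these aggregated numbers rather than the per-example details, the results file linked above can also be fetched directly. The snippet below is a minimal sketch, assuming the `huggingface_hub` client as an extra dependency and the filename taken from the link above; the exact layout of the JSON may vary between harness versions, so the access is kept defensive:
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v1",
    filename="results_2023-08-24T06:11:39.305449.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# The "all" block shown above may sit at the top level or under a "results"
# key depending on the file layout; handle both cases.
aggregated = results.get("results", results)
print(aggregated["all"])
```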
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa-v2 | 2023-08-27T12:45:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/Platypus2xOpenOrca-13B-LoRa-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/Platypus2xOpenOrca-13B-LoRa-v2](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-LoRa-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa-v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-23T06:03:44.232629](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa-v2/blob/main/results_2023-08-23T06%3A03%3A44.232629.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5049354082266176,\n\
\ \"acc_stderr\": 0.03490471110016842,\n \"acc_norm\": 0.5089570434188554,\n\
\ \"acc_norm_stderr\": 0.03488592349658449,\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.016095884155386854,\n \"mc2\": 0.4343083766358028,\n\
\ \"mc2_stderr\": 0.01488144822088394\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5520477815699659,\n \"acc_stderr\": 0.014532011498211672,\n\
\ \"acc_norm\": 0.5861774744027304,\n \"acc_norm_stderr\": 0.01439273000922101\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6085441147181836,\n\
\ \"acc_stderr\": 0.00487078503670829,\n \"acc_norm\": 0.8116908982274448,\n\
\ \"acc_norm_stderr\": 0.0039015979142464933\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.04033565667848319,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.04033565667848319\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5471698113207547,\n \"acc_stderr\": 0.030635627957961823,\n\
\ \"acc_norm\": 0.5471698113207547,\n \"acc_norm_stderr\": 0.030635627957961823\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.45664739884393063,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237656,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237656\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.03208115750788683,\n\
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.03208115750788683\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.038924311065187546,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.038924311065187546\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3724137931034483,\n \"acc_stderr\": 0.04028731532947558,\n\
\ \"acc_norm\": 0.3724137931034483,\n \"acc_norm_stderr\": 0.04028731532947558\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899207,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899207\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5741935483870968,\n \"acc_stderr\": 0.028129112709165904,\n \"\
acc_norm\": 0.5741935483870968,\n \"acc_norm_stderr\": 0.028129112709165904\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39408866995073893,\n \"acc_stderr\": 0.03438157967036545,\n \"\
acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.03438157967036545\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5606060606060606,\n \"acc_stderr\": 0.0353608594752948,\n \"acc_norm\"\
: 0.5606060606060606,\n \"acc_norm_stderr\": 0.0353608594752948\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6321243523316062,\n \"acc_stderr\": 0.034801756684660366,\n\
\ \"acc_norm\": 0.6321243523316062,\n \"acc_norm_stderr\": 0.034801756684660366\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4846153846153846,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.4846153846153846,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115007,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7119266055045872,\n \"acc_stderr\": 0.019416445892636032,\n \"\
acc_norm\": 0.7119266055045872,\n \"acc_norm_stderr\": 0.019416445892636032\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.032568505702936484,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.032568505702936484\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7205882352941176,\n \"acc_stderr\": 0.031493281045079556,\n \"\
acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.031493281045079556\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035296,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035296\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553893,\n\
\ \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553893\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.03814269893261838,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.03814269893261838\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6213592233009708,\n \"acc_stderr\": 0.04802694698258973,\n\
\ \"acc_norm\": 0.6213592233009708,\n \"acc_norm_stderr\": 0.04802694698258973\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.02777883590493543,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.02777883590493543\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7215836526181354,\n\
\ \"acc_stderr\": 0.01602829518899248,\n \"acc_norm\": 0.7215836526181354,\n\
\ \"acc_norm_stderr\": 0.01602829518899248\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.026788811931562757,\n\
\ \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.026788811931562757\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29608938547486036,\n\
\ \"acc_stderr\": 0.015268677317602269,\n \"acc_norm\": 0.29608938547486036,\n\
\ \"acc_norm_stderr\": 0.015268677317602269\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.477124183006536,\n \"acc_stderr\": 0.028599936776089782,\n\
\ \"acc_norm\": 0.477124183006536,\n \"acc_norm_stderr\": 0.028599936776089782\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\
\ \"acc_stderr\": 0.027731258647011998,\n \"acc_norm\": 0.6077170418006431,\n\
\ \"acc_norm_stderr\": 0.027731258647011998\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\
\ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778845,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778845\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42503259452411996,\n\
\ \"acc_stderr\": 0.012625879884891996,\n \"acc_norm\": 0.42503259452411996,\n\
\ \"acc_norm_stderr\": 0.012625879884891996\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004137,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004137\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5277777777777778,\n \"acc_stderr\": 0.020196594933541197,\n \
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.020196594933541197\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.42448979591836733,\n \"acc_stderr\": 0.031642094879429414,\n\
\ \"acc_norm\": 0.42448979591836733,\n \"acc_norm_stderr\": 0.031642094879429414\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\
\ \"acc_stderr\": 0.033687874661154596,\n \"acc_norm\": 0.6517412935323383,\n\
\ \"acc_norm_stderr\": 0.033687874661154596\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.016095884155386854,\n \"mc2\": 0.4343083766358028,\n\
\ \"mc2_stderr\": 0.01488144822088394\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-LoRa-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|arc:challenge|25_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hellaswag|10_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T06:03:44.232629.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T06:03:44.232629.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T06_03_44.232629
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T06:03:44.232629.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T06:03:44.232629.parquet'
---
# Dataset Card for Evaluation run of yeontaek/Platypus2xOpenOrca-13B-LoRa-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-LoRa-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/Platypus2xOpenOrca-13B-LoRa-v2](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-LoRa-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
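As a minimal sketch (assuming the aggregated configuration is exposed under the name "results" mentioned above, and that it offers the same "latest" split alias as the per-task configurations listed in this card's metadata), it can be loaded directly:
```python
from datasets import load_dataset
# Hedged sketch: "results" is the aggregated configuration named in the card text;
# the "latest" split alias is assumed to follow the same pattern as the per-task configs.
aggregated = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa-v2",
    "results",
    split="latest",
)
```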
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa-v2",
"harness_truthfulqa_mc_0",
split="train")
```
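The per-task detail configurations follow the same pattern; as a sketch (config and split names taken from the configuration listing in this card's metadata), a single MMLU subtask can be loaded via its "latest" alias or pinned to a specific run through its timestamp-named split:
```python
from datasets import load_dataset
# Config and split names are taken from the configuration listing in this card's metadata.
world_religions = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa-v2",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
# Pin a specific run by using its timestamp-named split instead of "latest":
pinned = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa-v2",
    "harness_hendrycksTest_world_religions_5",
    split="2023_08_23T06_03_44.232629",
)
```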
## Latest results
These are the [latest results from run 2023-08-23T06:03:44.232629](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa-v2/blob/main/results_2023-08-23T06%3A03%3A44.232629.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5049354082266176,
"acc_stderr": 0.03490471110016842,
"acc_norm": 0.5089570434188554,
"acc_norm_stderr": 0.03488592349658449,
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386854,
"mc2": 0.4343083766358028,
"mc2_stderr": 0.01488144822088394
},
"harness|arc:challenge|25": {
"acc": 0.5520477815699659,
"acc_stderr": 0.014532011498211672,
"acc_norm": 0.5861774744027304,
"acc_norm_stderr": 0.01439273000922101
},
"harness|hellaswag|10": {
"acc": 0.6085441147181836,
"acc_stderr": 0.00487078503670829,
"acc_norm": 0.8116908982274448,
"acc_norm_stderr": 0.0039015979142464933
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5471698113207547,
"acc_stderr": 0.030635627957961823,
"acc_norm": 0.5471698113207547,
"acc_norm_stderr": 0.030635627957961823
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237656,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237656
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.03208115750788683,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.03208115750788683
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.038924311065187546,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.038924311065187546
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3724137931034483,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.3724137931034483,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899207,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165904,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165904
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.03438157967036545,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.03438157967036545
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5606060606060606,
"acc_stderr": 0.0353608594752948,
"acc_norm": 0.5606060606060606,
"acc_norm_stderr": 0.0353608594752948
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6321243523316062,
"acc_stderr": 0.034801756684660366,
"acc_norm": 0.6321243523316062,
"acc_norm_stderr": 0.034801756684660366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4846153846153846,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.4846153846153846,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115007,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7119266055045872,
"acc_stderr": 0.019416445892636032,
"acc_norm": 0.7119266055045872,
"acc_norm_stderr": 0.019416445892636032
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.032568505702936484,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.032568505702936484
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035296,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553893,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553893
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907062,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907062
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.03814269893261838,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.03814269893261838
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.6213592233009708,
"acc_stderr": 0.04802694698258973,
"acc_norm": 0.6213592233009708,
"acc_norm_stderr": 0.04802694698258973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.02777883590493543,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.02777883590493543
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7215836526181354,
"acc_stderr": 0.01602829518899248,
"acc_norm": 0.7215836526181354,
"acc_norm_stderr": 0.01602829518899248
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.026788811931562757,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.026788811931562757
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29608938547486036,
"acc_stderr": 0.015268677317602269,
"acc_norm": 0.29608938547486036,
"acc_norm_stderr": 0.015268677317602269
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.477124183006536,
"acc_stderr": 0.028599936776089782,
"acc_norm": 0.477124183006536,
"acc_norm_stderr": 0.028599936776089782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.027731258647011998,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.027731258647011998
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6141975308641975,
"acc_stderr": 0.027085401226132143,
"acc_norm": 0.6141975308641975,
"acc_norm_stderr": 0.027085401226132143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778845,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778845
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42503259452411996,
"acc_stderr": 0.012625879884891996,
"acc_norm": 0.42503259452411996,
"acc_norm_stderr": 0.012625879884891996
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.030320243265004137,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.030320243265004137
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.020196594933541197,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.020196594933541197
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.42448979591836733,
"acc_stderr": 0.031642094879429414,
"acc_norm": 0.42448979591836733,
"acc_norm_stderr": 0.031642094879429414
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.033687874661154596,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.033687874661154596
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386854,
"mc2": 0.4343083766358028,
"mc2_stderr": 0.01488144822088394
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-ensemble | 2023-08-27T12:45:22.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-ensemble\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-23T14:48:47.168259](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-ensemble/blob/main/results_2023-08-23T14%3A48%3A47.168259.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.591031165151078,\n\
\ \"acc_stderr\": 0.033910448517384374,\n \"acc_norm\": 0.5951164394626702,\n\
\ \"acc_norm_stderr\": 0.033889044058760844,\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241477,\n \"mc2\": 0.47461350738527963,\n\
\ \"mc2_stderr\": 0.015202805791318129\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216386,\n\
\ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000326\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6185022903804023,\n\
\ \"acc_stderr\": 0.004847615216473461,\n \"acc_norm\": 0.8228440549691296,\n\
\ \"acc_norm_stderr\": 0.0038102033089010925\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.02450877752102842,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.02450877752102842\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7064516129032258,\n \"acc_stderr\": 0.025906087021319295,\n \"\
acc_norm\": 0.7064516129032258,\n \"acc_norm_stderr\": 0.025906087021319295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486519,\n \"\
acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486519\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.024962683564331806,\n\
\ \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331806\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790215,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790215\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.04039314978724561,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.04039314978724561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.02363687331748927,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.02363687331748927\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7547892720306514,\n\
\ \"acc_stderr\": 0.015384352284543941,\n \"acc_norm\": 0.7547892720306514,\n\
\ \"acc_norm_stderr\": 0.015384352284543941\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895803,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895803\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n\
\ \"acc_stderr\": 0.01663583834163192,\n \"acc_norm\": 0.4491620111731844,\n\
\ \"acc_norm_stderr\": 0.01663583834163192\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023337,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023337\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934023,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934023\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409828,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409828\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303055,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303055\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4517601043024772,\n\
\ \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.4517601043024772,\n\
\ \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.03010563657001663,\n\
\ \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.03010563657001663\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6029411764705882,\n \"acc_stderr\": 0.019794488900024124,\n \
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.019794488900024124\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.02971932942241748,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.02971932942241748\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.02992941540834839,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.02992941540834839\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241477,\n \"mc2\": 0.47461350738527963,\n\
\ \"mc2_stderr\": 0.015202805791318129\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|arc:challenge|25_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hellaswag|10_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T14:48:47.168259.parquet'
---
# Dataset Card for Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-ensemble",
"harness_truthfulqa_mc_0",
	split="latest")
```
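You can also enumerate every available configuration before picking one, and load the "latest" split of any task-level config in the same way. The snippet below is a minimal sketch using the standard `datasets` utilities; `harness_hendrycksTest_college_chemistry_5` is just one of the configurations listed in the metadata above.
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-ensemble"

# List every configuration exposed by this details repository
# (one per evaluated task, plus the aggregated "results" config).
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Load the most recent per-sample details for a single task.
details = load_dataset(repo, "harness_hendrycksTest_college_chemistry_5", split="latest")
print(details)
```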
## Latest results
These are the [latest results from run 2023-08-23T14:48:47.168259](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-ensemble/blob/main/results_2023-08-23T14%3A48%3A47.168259.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.591031165151078,
"acc_stderr": 0.033910448517384374,
"acc_norm": 0.5951164394626702,
"acc_norm_stderr": 0.033889044058760844,
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241477,
"mc2": 0.47461350738527963,
"mc2_stderr": 0.015202805791318129
},
"harness|arc:challenge|25": {
"acc": 0.5844709897610921,
"acc_stderr": 0.014401366641216386,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000326
},
"harness|hellaswag|10": {
"acc": 0.6185022903804023,
"acc_stderr": 0.004847615216473461,
"acc_norm": 0.8228440549691296,
"acc_norm_stderr": 0.0038102033089010925
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.02450877752102842,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.02450877752102842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.025906087021319295,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.025906087021319295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486519,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486519
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.024962683564331806,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.024962683564331806
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790215,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790215
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.02363687331748927,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.02363687331748927
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7547892720306514,
"acc_stderr": 0.015384352284543941,
"acc_norm": 0.7547892720306514,
"acc_norm_stderr": 0.015384352284543941
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895803,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4491620111731844,
"acc_stderr": 0.01663583834163192,
"acc_norm": 0.4491620111731844,
"acc_norm_stderr": 0.01663583834163192
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.027780141207023337,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.027780141207023337
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934023,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934023
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409828,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409828
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4517601043024772,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.4517601043024772,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5661764705882353,
"acc_stderr": 0.03010563657001663,
"acc_norm": 0.5661764705882353,
"acc_norm_stderr": 0.03010563657001663
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.019794488900024124,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.019794488900024124
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.02971932942241748,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.02971932942241748
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.02992941540834839,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.02992941540834839
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241477,
"mc2": 0.47461350738527963,
"mc2_stderr": 0.015202805791318129
}
}
```
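The top-level `all` block aggregates the per-task scores; for the accuracy metrics it corresponds to an unweighted mean over the tasks that report them. As a quick sanity check (not part of the official pipeline), assuming the dictionary printed above has been parsed into a Python variable named `results`, you can recompute it as follows:
```python
# `results` is assumed to hold the dict shown above (e.g. results = json.loads(raw)).
# Average the "acc" values over every task entry that reports one; the "all"
# entry itself and truthfulqa (which only reports mc1/mc2) are skipped.
per_task_acc = [
    metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
]
print(sum(per_task_acc) / len(per_task_acc))  # should be close to results["all"]["acc"]
```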
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-v4 | 2023-09-18T08:06:37.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3-v4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/Platypus2xOpenOrca-13B-IA3-v4](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-v4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-v4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T08:06:24.321760](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-v4/blob/main/results_2023-09-18T08-06-24.321760.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006396812080536913,\n\
\ \"em_stderr\": 0.0008164468837432404,\n \"f1\": 0.0888349412751677,\n\
\ \"f1_stderr\": 0.0018756142649103522,\n \"acc\": 0.4401587986402365,\n\
\ \"acc_stderr\": 0.010178434162145544\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.006396812080536913,\n \"em_stderr\": 0.0008164468837432404,\n\
\ \"f1\": 0.0888349412751677,\n \"f1_stderr\": 0.0018756142649103522\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10841546626231995,\n \
\ \"acc_stderr\": 0.008563852506627488\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663597\n\
\ }\n}\n```"
repo_url: https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-v4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|arc:challenge|25_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T08_06_24.321760
path:
- '**/details_harness|drop|3_2023-09-18T08-06-24.321760.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T08-06-24.321760.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T08_06_24.321760
path:
- '**/details_harness|gsm8k|5_2023-09-18T08-06-24.321760.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T08-06-24.321760.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hellaswag|10_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:45:16.156132.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T14_45_16.156132
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T14:45:16.156132.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T14:45:16.156132.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T08_06_24.321760
path:
- '**/details_harness|winogrande|5_2023-09-18T08-06-24.321760.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T08-06-24.321760.parquet'
- config_name: results
data_files:
- split: 2023_09_18T08_06_24.321760
path:
- results_2023-09-18T08-06-24.321760.parquet
- split: latest
path:
- results_2023-09-18T08-06-24.321760.parquet
---
# Dataset Card for Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3-v4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-v4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/Platypus2xOpenOrca-13B-IA3-v4](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-v4",
"harness_winogrande_5",
split="train")
```
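The aggregated metrics live in the "results" configuration and can be loaded in the same way. A minimal sketch, assuming you only want to inspect the available columns before relying on any particular field name (the exact column layout of the results parquet is not documented here):

```python
from datasets import load_dataset

# Load the aggregated results; "latest" always points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-v4",
    "results",
    split="latest",
)

# Check the schema before accessing specific metric fields.
print(results.column_names)
```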
## Latest results
These are the [latest results from run 2023-09-18T08:06:24.321760](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-v4/blob/main/results_2023-09-18T08-06-24.321760.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task can be found in its own configuration, under the "latest" split):
```python
{
"all": {
"em": 0.006396812080536913,
"em_stderr": 0.0008164468837432404,
"f1": 0.0888349412751677,
"f1_stderr": 0.0018756142649103522,
"acc": 0.4401587986402365,
"acc_stderr": 0.010178434162145544
},
"harness|drop|3": {
"em": 0.006396812080536913,
"em_stderr": 0.0008164468837432404,
"f1": 0.0888349412751677,
"f1_stderr": 0.0018756142649103522
},
"harness|gsm8k|5": {
"acc": 0.10841546626231995,
"acc_stderr": 0.008563852506627488
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.011793015817663597
}
}
```
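If you prefer working with the raw JSON report linked above, it can also be fetched directly with `huggingface_hub`. A minimal sketch (the exact top-level layout of the JSON file is an assumption here, so the code falls back to the document root if there is no "results" key):

```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results JSON; repo_type="dataset" is required because this
# is a dataset repository, not a model repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-v4",
    filename="results_2023-09-18T08-06-24.321760.json",
    repo_type="dataset",
)

with open(path) as f:
    report = json.load(f)

# The per-task metrics may sit under a top-level "results" key (assumption);
# otherwise read them from the document root.
metrics = report.get("results", report)
print(metrics["harness|winogrande|5"]["acc"])
```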
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]