| id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
dotan1111/MSA-nuc-10-seq | 2023-09-18T11:50:27.000Z | [
"sequence-to-sequence",
"bioinformatics",
"biology",
"region:us"
] | dotan1111 | null | null | null | 0 | 0 | ---
tags:
- sequence-to-sequence
- bioinformatics
- biology
---
# Multiple Sequence Alignment as a Sequence-to-Sequence Learning Problem
## Abstract:
The sequence alignment problem is one of the most fundamental problems in bioinformatics, and a plethora of methods has been devised to tackle it. Here we introduce BetaAlign, a methodology for aligning sequences using an NLP approach. BetaAlign accounts for the possible variability of the evolutionary process among different datasets by using an ensemble of transformers, each trained on millions of samples generated from a different evolutionary model. Our approach leads to alignment accuracy that is similar to, and often better than, that of commonly used methods, such as MAFFT, DIALIGN, ClustalW, T-Coffee, PRANK, and MUSCLE.

An illustration of aligning sequences with sequence-to-sequence learning. (a) Consider two input sequences "AAG" and "ACGG". (b) The result of encoding the unaligned sequences into the source language (*Concat* representation). (c) The sentence from the source language is translated to the target language via a transformer model. (d) The translated sentence in the target language (*Spaces* representation). (e) The resulting alignment, decoded from the translated sentence, in which "AA-G" is aligned to "ACGG". The transformer architecture illustration is adapted from (Vaswani et al., 2017).
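The two representations can be made concrete with a short sketch. Note that the separator token used in the *Concat* representation below is a placeholder assumption (the caption does not specify it); the column-wise reading of the *Spaces* representation matches the example record given in the Example section.
```python
# Hedged sketch of the Concat (source) and Spaces (target) representations.
# The "|" separator is a placeholder assumption, not necessarily BetaAlign's token.

def encode_concat(seqs):
    """Join the unaligned sequences into a single source sentence."""
    return "|".join(seqs)

def decode_spaces(target, n_seqs):
    """Read the target sentence column by column: every n_seqs consecutive
    characters form one alignment column."""
    assert len(target) % n_seqs == 0
    return [target[r::n_seqs] for r in range(n_seqs)]

print(encode_concat(["AAG", "ACGG"]))       # AAG|ACGG
print(decode_spaces("AAAC-GGG", n_seqs=2))  # ['AA-G', 'ACGG']
```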
## Data:
We used SpartaABC (Loewenthal et al., 2021) to generate millions of true alignments. SpartaABC requires the following input: (1) a rooted phylogenetic tree, which includes a topology and branch lengths; (2) a substitution model (amino acids or nucleotides); (3) root sequence length; (4) the indel model parameters, which include: insertion rate (*R_I*), deletion rate (*R_D*), a parameter for the insertion Zipfian distribution (*A_I*), and a parameter for the deletion Zipfian distribution (*A_D*). MSAs were simulated along random phylogenetic tree topologies generated using the program ETE version 3.0 (Huerta-Cepas et al., 2016) with default parameters.
We generated 1,495,000, 2,000, and 3,000 protein MSAs with ten sequences each, which were used as training, validation, and testing data, respectively. We generated the same number of DNA MSAs. For each random tree, branch lengths were drawn from a uniform distribution in the range *(0.5,1.0)*. Next, the sequences were generated using SpartaABC with the following parameters: *R_I,R_D \in (0.0,0.05)*, *A_I, A_D \in (1.01,2.0)*. The alignment lengths as well as the sequence lengths of the tree leaves vary within and among datasets, as they depend on the indel dynamics and the root length. The root length was sampled uniformly in the range *[32,44]*. Unless stated otherwise, all protein datasets were generated with the WAG+G model, and all DNA datasets were generated with the GTR+G model, with the following parameters: (1) frequencies for the different nucleotides *(0.37, 0.166, 0.307, 0.158)*, in the order "T", "C", "A", and "G"; (2) substitution rates *(0.444, 0.0843, 0.116, 0.107, 0.00027)*, in the order "a", "b", "c", "d", and "e" for the substitution matrix.
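As a sketch, the per-dataset parameter draws described above can be written as follows; only the sampling is shown, and the SpartaABC invocation itself is omitted.
```python
# Hedged sketch of the parameter draws used to generate each dataset;
# the actual call to SpartaABC is not shown here.
import random

def sample_simulation_params():
    return {
        "branch_length": random.uniform(0.5, 1.0),  # drawn per branch of the tree
        "R_I": random.uniform(0.0, 0.05),           # insertion rate
        "R_D": random.uniform(0.0, 0.05),           # deletion rate
        "A_I": random.uniform(1.01, 2.0),           # insertion Zipfian parameter
        "A_D": random.uniform(1.01, 2.0),           # deletion Zipfian parameter
        "root_length": random.randint(32, 44),      # root sequence length, in [32, 44]
    }

print(sample_simulation_params())
```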
## Example:
The following example corresponds to the MSA illustrated in the figure above:
{"MSA": "AAAC-GGG", "unaligned_seqs": {"seq0": "AAG", "seq1": "ACGG"}}
## APA
```
Dotan, E., Belinkov, Y., Avram, O., Wygoda, E., Ecker, N., Alburquerque, M., Keren, O., Loewenthal, G., & Pupko T. (2023). Multiple sequence alignment as a sequence-to-sequence learning problem. The Eleventh International Conference on Learning Representations (ICLR 2023).
```
## BibTeX
```
@inproceedings{Dotan_multiple_2023,
author = {Dotan, Edo and Belinkov, Yonatan and Avram, Oren and Wygoda, Elya and Ecker, Noa and Alburquerque, Michael and Keren, Omri and Loewenthal, Gil and Pupko, Tal},
booktitle = {{The Eleventh International Conference on Learning Representations (ICLR 2023)}},
month = aug,
title = {{Multiple sequence alignment as a sequence-to-sequence learning problem}},
year = {2023}
}
``` |
Amanaccessassist/learn | 2023-09-18T11:09:44.000Z | [
"region:us"
] | Amanaccessassist | null | null | null | 0 | 0 | Entry not found |
RiazHussain/indian_food_images | 2023-09-19T07:20:57.000Z | [
"region:us"
] | RiazHussain | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': burger
'1': butter_naan
'2': chai
'3': chapati
'4': chole_bhature
'5': dal_makhani
'6': dhokla
'7': fried_rice
'8': idli
'9': jalebi
'10': kaathi_rolls
'11': kadai_paneer
'12': kulfi
'13': masala_dosa
'14': momos
'15': paani_puri
'16': pakode
'17': pav_bhaji
'18': pizza
'19': samosa
splits:
- name: train
num_bytes: 1370201244.9594336
num_examples: 5328
- name: test
num_bytes: 208936489.3925666
num_examples: 941
download_size: 1601617594
dataset_size: 1579137734.3520002
---
# Dataset Card for "indian_food_images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/renaiflops | 2023-09-29T08:58:27.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Ren'ai Flops
This is the image base of the bangumi Ren'ai Flops. We detected 19 characters and 1,980 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is a preview of the characters:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 714 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 182 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 10 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 8 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 13 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 19 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 170 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 95 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 47 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 101 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 197 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 42 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 8 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 74 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 169 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 6 | [Download](15/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 16 | 7 | [Download](16/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 17 | 6 | [Download](17/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| noise | 112 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
spyropoulos/products | 2023-09-18T11:54:21.000Z | [
"region:us"
] | spyropoulos | null | null | null | 0 | 0 | Entry not found |
DucHaiten/dantocdao | 2023-09-18T11:57:44.000Z | [
"license:creativeml-openrail-m",
"region:us"
] | DucHaiten | null | null | null | 0 | 0 | ---
license: creativeml-openrail-m
---
|
nielsr/test-maskrcnn | 2023-09-18T13:25:22.000Z | [
"region:us"
] | nielsr | null | null | null | 0 | 0 | Entry not found |
dpoudel/lightify | 2023-09-18T12:27:32.000Z | [
"region:us"
] | dpoudel | null | null | null | 0 | 0 | Entry not found |
marasama/nva-Painleve | 2023-09-18T16:43:50.000Z | [
"region:us"
] | marasama | null | null | null | 0 | 0 | Entry not found |
juanberasategui/Master_Thesis_Data | 2023-09-18T12:26:26.000Z | [
"region:us"
] | juanberasategui | null | null | null | 0 | 0 | |
CyberHarem/goto_hitori_bocchitherock | 2023-09-18T12:35:08.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Gotō Hitori
This is the dataset of Gotō Hitori, containing 300 images and their tags.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)). The packaged variants are listed in the table below, followed by a short download sketch.
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 648 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 648 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 648 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 648 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
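A sketch of fetching one of the packaged variants listed above with `huggingface_hub` (the filename is taken from the table; the call is the library's standard download API):
```python
from huggingface_hub import hf_hub_download

# Download one archive from the table above to the local cache.
path = hf_hub_download(
    repo_id="CyberHarem/goto_hitori_bocchitherock",
    filename="dataset-raw.zip",  # any "Download" entry from the table works
    repo_type="dataset",
)
print(path)  # local path to the downloaded zip
```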
|
Siddmvl/grisha-example | 2023-09-18T12:34:55.000Z | [
"region:us"
] | Siddmvl | null | null | null | 0 | 0 | Entry not found |
KarthikReddyM/research | 2023-09-18T12:43:12.000Z | [
"region:us"
] | KarthikReddyM | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_marcchew__LaMini-40k-Platypus2-7B | 2023-09-18T12:52:37.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of marcchew/LaMini-40k-Platypus2-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [marcchew/LaMini-40k-Platypus2-7B](https://huggingface.co/marcchew/LaMini-40k-Platypus2-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_marcchew__LaMini-40k-Platypus2-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T12:51:11.107895](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__LaMini-40k-Platypus2-7B/blob/main/results_2023-09-18T12-51-11.107895.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2694137303437303,\n\
\ \"acc_stderr\": 0.03177853404824957,\n \"acc_norm\": 0.27050564744456485,\n\
\ \"acc_norm_stderr\": 0.03179466462579956,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.0148161959919316,\n \"mc2\": 0.4738995426146171,\n\
\ \"mc2_stderr\": 0.01665596277298284\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2295221843003413,\n \"acc_stderr\": 0.012288926760890788,\n\
\ \"acc_norm\": 0.28498293515358364,\n \"acc_norm_stderr\": 0.013191348179838793\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25423222465644296,\n\
\ \"acc_stderr\": 0.004345388614520023,\n \"acc_norm\": 0.26319458275243973,\n\
\ \"acc_norm_stderr\": 0.004394671271021432\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.3352601156069364,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838728,\n\
\ \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838728\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776564,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776564\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509477,\n \
\ \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.03095663632856655,\n \
\ \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.03095663632856655\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"\
acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n\
\ \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n\
\ \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.20253164556962025,\n \"acc_stderr\": 0.026160568246601457,\n\
\ \"acc_norm\": 0.20253164556962025,\n \"acc_norm_stderr\": 0.026160568246601457\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n\
\ \"acc_stderr\": 0.020799400082879997,\n \"acc_norm\": 0.10762331838565023,\n\
\ \"acc_norm_stderr\": 0.020799400082879997\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"\
acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n\
\ \"acc_stderr\": 0.03485946096475741,\n \"acc_norm\": 0.16071428571428573,\n\
\ \"acc_norm_stderr\": 0.03485946096475741\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20434227330779056,\n\
\ \"acc_stderr\": 0.0144191239809319,\n \"acc_norm\": 0.20434227330779056,\n\
\ \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02609016250427905,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02609016250427905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537762,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537762\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24445893089960888,\n\
\ \"acc_stderr\": 0.010976425013113886,\n \"acc_norm\": 0.24445893089960888,\n\
\ \"acc_norm_stderr\": 0.010976425013113886\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2173202614379085,\n \"acc_stderr\": 0.01668482092914859,\n \
\ \"acc_norm\": 0.2173202614379085,\n \"acc_norm_stderr\": 0.01668482092914859\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.26865671641791045,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.26865671641791045,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n\
\ \"acc_stderr\": 0.030709824050565274,\n \"acc_norm\": 0.1927710843373494,\n\
\ \"acc_norm_stderr\": 0.030709824050565274\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n\
\ \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.0148161959919316,\n \"mc2\": 0.4738995426146171,\n\
\ \"mc2_stderr\": 0.01665596277298284\n }\n}\n```"
repo_url: https://huggingface.co/marcchew/LaMini-40k-Platypus2-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|arc:challenge|25_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hellaswag|10_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T12-51-11.107895.parquet'
- config_name: results
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- results_2023-09-18T12-51-11.107895.parquet
- split: latest
path:
- results_2023-09-18T12-51-11.107895.parquet
---
# Dataset Card for Evaluation run of marcchew/LaMini-40k-Platypus2-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/marcchew/LaMini-40k-Platypus2-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [marcchew/LaMini-40k-Platypus2-7B](https://huggingface.co/marcchew/LaMini-40k-Platypus2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_marcchew__LaMini-40k-Platypus2-7B",
"harness_truthfulqa_mc_0",
split="train")
```
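The aggregated scores can be loaded in the same way, using the "results" configuration and its "latest" split, both declared in the YAML header above:
```python
from datasets import load_dataset

results = load_dataset("open-llm-leaderboard/details_marcchew__LaMini-40k-Platypus2-7B",
                       "results",
                       split="latest")
```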
## Latest results
These are the [latest results from run 2023-09-18T12:51:11.107895](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__LaMini-40k-Platypus2-7B/blob/main/results_2023-09-18T12-51-11.107895.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2694137303437303,
"acc_stderr": 0.03177853404824957,
"acc_norm": 0.27050564744456485,
"acc_norm_stderr": 0.03179466462579956,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.0148161959919316,
"mc2": 0.4738995426146171,
"mc2_stderr": 0.01665596277298284
},
"harness|arc:challenge|25": {
"acc": 0.2295221843003413,
"acc_stderr": 0.012288926760890788,
"acc_norm": 0.28498293515358364,
"acc_norm_stderr": 0.013191348179838793
},
"harness|hellaswag|10": {
"acc": 0.25423222465644296,
"acc_stderr": 0.004345388614520023,
"acc_norm": 0.26319458275243973,
"acc_norm_stderr": 0.004394671271021432
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.026556982117838728,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.026556982117838728
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776564,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3641025641025641,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.3641025641025641,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20253164556962025,
"acc_stderr": 0.026160568246601457,
"acc_norm": 0.20253164556962025,
"acc_norm_stderr": 0.026160568246601457
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.10762331838565023,
"acc_stderr": 0.020799400082879997,
"acc_norm": 0.10762331838565023,
"acc_norm_stderr": 0.020799400082879997
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.14049586776859505,
"acc_stderr": 0.03172233426002161,
"acc_norm": 0.14049586776859505,
"acc_norm_stderr": 0.03172233426002161
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16071428571428573,
"acc_stderr": 0.03485946096475741,
"acc_norm": 0.16071428571428573,
"acc_norm_stderr": 0.03485946096475741
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20434227330779056,
"acc_stderr": 0.0144191239809319,
"acc_norm": 0.20434227330779056,
"acc_norm_stderr": 0.0144191239809319
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.025518731049537762,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.025518731049537762
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24445893089960888,
"acc_stderr": 0.010976425013113886,
"acc_norm": 0.24445893089960888,
"acc_norm_stderr": 0.010976425013113886
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2173202614379085,
"acc_stderr": 0.01668482092914859,
"acc_norm": 0.2173202614379085,
"acc_norm_stderr": 0.01668482092914859
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.26865671641791045,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.26865671641791045,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.030709824050565274,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.030709824050565274
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727654,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.0148161959919316,
"mc2": 0.4738995426146171,
"mc2_stderr": 0.01665596277298284
}
}
```
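The aggregated numbers above can also be loaded programmatically from the "results" configuration declared in this repo's metadata; a minimal sketch using the same `datasets` API as the loading example above:
```python
from datasets import load_dataset

# Load the aggregated results (the "results" config, "latest" split)
# rather than the per-task detail files.
results = load_dataset(
    "open-llm-leaderboard/details_marcchew__LaMini-40k-Platypus2-7B",
    "results",
    split="latest",
)
```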
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
garcianacho/bat_genome | 2023-09-18T12:56:09.000Z | [
"region:us"
] | garcianacho | null | null | null | 0 | 0 | Entry not found |
CyberHarem/ijichi_nijika_bocchitherock | 2023-09-18T13:02:38.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Ijichi Nijika
This is the dataset of Ijichi Nijika, containing 296 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 296 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 684 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 296 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 296 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 296 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 296 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 296 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 684 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 684 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 684 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
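As a minimal sketch (assuming the zip names in the table above live at the root of this dataset repo), one way to fetch a single pack is with `huggingface_hub`:
```python
from huggingface_hub import hf_hub_download

# Download one packaged archive from this dataset repo; the filename must
# match one of the zips listed in the table above.
path = hf_hub_download(
    repo_id="CyberHarem/ijichi_nijika_bocchitherock",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
print(path)  # local path of the cached archive
```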
|
TrainingDataPro/miners-detection | 2023-09-29T08:29:36.000Z | [
"task_categories:image-classification",
"task_categories:object-detection",
"language:en",
"license:cc-by-nc-nd-4.0",
"code",
"region:us"
] | TrainingDataPro | The dataset consists of photos captured within various mines, focusing on **miners**
engaged in their work. Each photo is annotated with bounding box detection of the
miners, and an attribute highlights whether each miner is sitting or standing in the photo.
The dataset's diverse applications, such as computer vision, safety assessment and others,
make it a valuable resource for *researchers, employers, and policymakers in the mining
industry*. | @InProceedings{huggingface:dataset,
title = {miners-detection},
author = {TrainingDataPro},
year = {2023}
} | null | 1 | 0 | ---
language:
- en
license: cc-by-nc-nd-4.0
task_categories:
- image-classification
- object-detection
tags:
- code
dataset_info:
features:
- name: id
dtype: int32
- name: name
dtype: string
- name: image
dtype: image
- name: mask
dtype: image
- name: width
dtype: uint16
- name: height
dtype: uint16
- name: shapes
sequence:
- name: label
dtype:
class_label:
names:
'0': Miner
- name: type
dtype: string
- name: points
sequence:
sequence: float32
- name: rotation
dtype: float32
- name: occluded
dtype: uint8
- name: attributes
sequence:
- name: name
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 5907438
num_examples: 8
download_size: 5795853
dataset_size: 5907438
---
# Miners Detection dataset
The dataset consists of photos captured within various mines, focusing on **miners** engaged in their work. Each photo is annotated with bounding box detection of the miners, and an attribute highlights whether each miner is sitting or standing in the photo.
The dataset's diverse applications, such as computer vision, safety assessment and others, make it a valuable resource for *researchers, employers, and policymakers in the mining industry*.

# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=miners-detection) to discuss your requirements, learn about the price and buy the dataset.
# Dataset structure
- **images** - contains the original images of miners
- **boxes** - includes bounding box labeling for the original images
- **annotations.xml** - contains the coordinates of the bounding boxes and labels created for the original photos
# Data Format
Each image from the `images` folder is accompanied by an XML annotation in the `annotations.xml` file, indicating the coordinates of the bounding boxes for miner detection. For each point, the x and y coordinates are provided. The position of each miner is also provided by the attribute **is_sitting** (true, false).
# Example of XML file structure
.png?generation=1695040600108833&alt=media)
# Miners detection can be performed in accordance with your requirements.
## [TrainingData](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=miners-detection) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
BangumiBase/futokunoguild | 2023-09-29T09:08:38.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Futoku No Guild
This is the image base of the bangumi Futoku no Guild; we detected 23 characters and 2459 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models on this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 192 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 395 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 9 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 5 | [Download](3/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 4 | 155 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 116 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 221 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 48 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 51 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 12 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 261 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 16 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 68 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 12 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 11 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 10 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 27 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 443 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 17 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 227 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 20 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 10 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 133 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
piyushghante/temp | 2023-09-18T13:17:14.000Z | [
"region:us"
] | piyushghante | null | null | null | 0 | 0 | {
"data": [
{
"title": "Colliery Control Order, 2000",
"paragraphs": [
{
"context": "The Colliery Control Order, 2000 was issued by the Government of India in 2000. In exercise of the powers conferred by section 3 read with section 5 of the Essential Commodities Act, 1955 (10 of 1955) and in supersession of the Colliery Control Order, 1945, except as respects things done or omitted to be done before such supersession, the Government of India has issued a Gazette Notification on 1.1.2000 to publish the Colliery Control Order, 2000. The content of the Colliery Control Order, 2000 is given below.\n1. Short title and commencement._ (1) This Order may be called the Colliery Control Order, 2000.\n(2) It shall come into force on the 1st day of January, 2000.\n2. Definitions. _In this Order, unless there is anything repugnant in the subject or context, -\n(a) 'coal' includes anthracite, bituminous coal, lignite, peat and any other form of carbonaceous matter sold or marketed as coal and also coke;\n(b) 'Coal Controller' means the person appointed as such by the Central Government under the provisions of the Coal Controller’s Organisation (Group ‘A’ Posts) Recruitment Rules, 1986;\n(c) 'colliery' means any mine or open working where winning or extraction of coal is the principal object of the mining, quarrying or any other operation carried on therein, and includes a plant for the production of coke or for the washing of coal;\n(d) 'disposal' includes agreeing or offering to dispose of, and the disposal of ownership or any proprietary interest, the right of possession and possession whether or not accompanied by any disposal of ownership or of any proprietary interest or of the right to possession;\n(e) ‘agent’, ‘manager’, and ‘owner’ when used in relation to a colliery shall have the meanings respectively assigned to them in the Mines Act,1952;\n(f) 'size' when used in relation to coal shall have the same specification as given, from time to time, by the Bureau of Indian Standards in their specification number IS:437-1979.",
"qas": [
{
"question": "What is the short title of the Colliery Control Order, 2000?",
"id": "q1",
"answers": [
{
"text": "The Colliery Control Order, 2000",
"answer_start": 181
}
]
},
{
"question": "Under what authority was the Colliery Control Order, 2000 issued by the Government of India?",
"id": "q2",
"answers": [
{
"text": "Essential Commodities Act, 1955",
"answer_start": 85
}
]
},
{
"question": "When was the Colliery Control Order, 2000 published?",
"id": "q3",
"answers": [
{
"text": "1.1.2000",
"answer_start": 212
}
]
},
{
"question": "What is the principal objective of a colliery, as defined in the Order?",
"id": "q4",
"answers": [
{
"text": "winning or extraction of coal",
"answer_start": 299
}
]
},
{
"question": "Who is referred to as the 'Coal Controller' in the context of the Colliery Control Order, 2000?",
"id": "q5",
"answers": [
{
"text": "the person appointed as such by the Central Government under the provisions of the Coal Controller’s Organisation Recruitment Rules, 1986",
"answer_start": 377
}
]
},
{
"question": "What types of carbonaceous matter are included in the definition of 'coal' in this Order?",
"id": "q6",
"answers": [
{
"text": "anthracite, bituminous coal, lignite, peat, and any other form of carbonaceous matter sold or marketed as coal, as well as coke",
"answer_start": 424
}
]
},
{
"question": "What is the significance of size in relation to coal according to the Order?",
"id": "q7",
"answers": [
{
"text": "specified by the Bureau of Indian Standards in their specification number IS:437-1979",
"answer_start": 532
}
]
},
{
"question": "How is the categorization of coal into classes, grades, and sizes determined?",
"id": "q8",
"answers": [
{
"text": "determined by the Central Government through notifications in the Official Gazette",
"answer_start": 600
}
]
},
{
"question": "Who is responsible for laying down the procedure and method of sampling and analysis of coal for grade maintenance in a colliery?",
"id": "q9",
"answers": [
{
"text": "The Coal Controller",
"answer_start": 727
}
]
},
{
"question": "What is the procedure for resolving disputes between a consumer and the owner of a colliery regarding the declaration of grades of coal?",
"id": "q10",
"answers": [
{
"text": "Disputes regarding the declaration of grades of coal may be referred to the Coal Controller, and the decision of the Coal Controller shall be binding on the owner of the colliery. A memorandum of reference to the Coal Controller regarding such disputes should be accompanied by a fee as specified by the Coal Controller.",
"answer_start": 855
}
]
}
]
}
]
}
]
} |
open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4 | 2023-09-18T13:15:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of wei123602/Llama-2-13b-FINETUNE4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wei123602/Llama-2-13b-FINETUNE4](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T13:14:12.416583](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4/blob/main/results_2023-09-18T13-14-12.416583.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5721730477835751,\n\
\ \"acc_stderr\": 0.03435830007549653,\n \"acc_norm\": 0.5765179420517365,\n\
\ \"acc_norm_stderr\": 0.03433789840389022,\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361005,\n \"mc2\": 0.432555793209622,\n\
\ \"mc2_stderr\": 0.014584160007096517\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5401023890784983,\n \"acc_stderr\": 0.01456431885692485,\n\
\ \"acc_norm\": 0.5870307167235495,\n \"acc_norm_stderr\": 0.014388344935398329\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.609838677554272,\n\
\ \"acc_stderr\": 0.004867893927258144,\n \"acc_norm\": 0.819259111730731,\n\
\ \"acc_norm_stderr\": 0.0038401692240122715\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03015113445777629,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03015113445777629\n },\n\
\ \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.03765746693865151,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.03765746693865151\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502706986,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502706986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.024326310529149138,\n \"\
acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 0.024326310529149138\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n\
\ \"acc_stderr\": 0.02743086657997347,\n \"acc_norm\": 0.632258064516129,\n\
\ \"acc_norm_stderr\": 0.02743086657997347\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178816,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178816\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.02951928261681723,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.02951928261681723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.02483881198803316,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.02483881198803316\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n\
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552379,\n \
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552379\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790215,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790215\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.034006036255382704,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.034006036255382704\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n\
\ \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n\
\ \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098177,\n\
\ \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098177\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n\
\ \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n\
\ \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5751633986928104,\n \"acc_stderr\": 0.02830457667314111,\n\
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.02830457667314111\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776165,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776165\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.026406145973625672,\n\
\ \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.026406145973625672\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n\
\ \"acc_stderr\": 0.01268201633564667,\n \"acc_norm\": 0.44132985658409385,\n\
\ \"acc_norm_stderr\": 0.01268201633564667\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n\
\ \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5833333333333334,\n \"acc_stderr\": 0.01994491413687358,\n \
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.01994491413687358\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5469387755102041,\n \"acc_stderr\": 0.03186785930004128,\n\
\ \"acc_norm\": 0.5469387755102041,\n \"acc_norm_stderr\": 0.03186785930004128\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932264,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932264\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533193,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533193\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361005,\n \"mc2\": 0.432555793209622,\n\
\ \"mc2_stderr\": 0.014584160007096517\n }\n}\n```"
repo_url: https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-14-12.416583.parquet'
- config_name: results
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- results_2023-09-18T13-14-12.416583.parquet
- split: latest
path:
- results_2023-09-18T13-14-12.416583.parquet
---
# Dataset Card for Evaluation run of wei123602/Llama-2-13b-FINETUNE4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wei123602/Llama-2-13b-FINETUNE4](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4",
"harness_truthfulqa_mc_0",
split="latest")
```
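Similarly, to retrieve the aggregated scores rather than the per-sample details, you can load the `results` configuration; a minimal sketch based on the config and split names in the listing above:
```python
from datasets import load_dataset

# "results" holds the aggregated scores; the "latest" split always points
# to the most recent run (see the configuration listing above).
results = load_dataset("open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4",
                       "results",
                       split="latest")
```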
## Latest results
These are the [latest results from run 2023-09-18T13:14:12.416583](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4/blob/main/results_2023-09-18T13-14-12.416583.json) (note that there might be results for other tasks in the repositories if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5721730477835751,
"acc_stderr": 0.03435830007549653,
"acc_norm": 0.5765179420517365,
"acc_norm_stderr": 0.03433789840389022,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361005,
"mc2": 0.432555793209622,
"mc2_stderr": 0.014584160007096517
},
"harness|arc:challenge|25": {
"acc": 0.5401023890784983,
"acc_stderr": 0.01456431885692485,
"acc_norm": 0.5870307167235495,
"acc_norm_stderr": 0.014388344935398329
},
"harness|hellaswag|10": {
"acc": 0.609838677554272,
"acc_stderr": 0.004867893927258144,
"acc_norm": 0.819259111730731,
"acc_norm_stderr": 0.0038401692240122715
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.03015113445777629,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03015113445777629
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.03765746693865151,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.03765746693865151
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502706986,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502706986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.024326310529149138,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.024326310529149138
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.632258064516129,
"acc_stderr": 0.02743086657997347,
"acc_norm": 0.632258064516129,
"acc_norm_stderr": 0.02743086657997347
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.03242497958178816,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.03242497958178816
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.02951928261681723,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.02951928261681723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.02483881198803316,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02483881198803316
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552379,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552379
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790215,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790215
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.034006036255382704,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.034006036255382704
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098177,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.02830457667314111,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.02830457667314111
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776165,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.026406145973625672,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.026406145973625672
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44132985658409385,
"acc_stderr": 0.01268201633564667,
"acc_norm": 0.44132985658409385,
"acc_norm_stderr": 0.01268201633564667
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003476,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003476
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.01994491413687358,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.01994491413687358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5469387755102041,
"acc_stderr": 0.03186785930004128,
"acc_norm": 0.5469387755102041,
"acc_norm_stderr": 0.03186785930004128
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932264,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932264
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533193,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361005,
"mc2": 0.432555793209622,
"mc2_stderr": 0.014584160007096517
}
}
```
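If you prefer to work with this JSON directly, one option is to download the raw results file from the repository and parse it yourself; a minimal sketch using `huggingface_hub`, assuming the downloaded file mirrors the dict shown above (the filename comes from the `results` configuration in the header):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4",
    repo_type="dataset",
    filename="results_2023-09-18T13-14-12.416583.json",
)

with open(path) as f:
    results = json.load(f)

# Assumes the file mirrors the structure printed above.
print(results["all"]["acc"])       # 0.5721730477835751
print(results["all"]["acc_norm"])  # 0.5765179420517365
```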
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Dampish__StellarX-4B-V0.2 | 2023-09-18T13:17:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Dampish/StellarX-4B-V0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Dampish/StellarX-4B-V0.2](https://huggingface.co/Dampish/StellarX-4B-V0.2) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Dampish__StellarX-4B-V0.2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T13:16:25.972049](https://huggingface.co/datasets/open-llm-leaderboard/details_Dampish__StellarX-4B-V0.2/blob/main/results_2023-09-18T13-16-25.972049.json)\
\ (note that there might be results for other tasks in the repositories if successive\
\ evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2592434520324127,\n\
\ \"acc_stderr\": 0.03173429851074048,\n \"acc_norm\": 0.2623040227525283,\n\
\ \"acc_norm_stderr\": 0.03173986420447102,\n \"mc1\": 0.21664626682986537,\n\
\ \"mc1_stderr\": 0.014421468452506985,\n \"mc2\": 0.38554214033294804,\n\
\ \"mc2_stderr\": 0.014744935679888965\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3174061433447099,\n \"acc_stderr\": 0.01360223908803817,\n\
\ \"acc_norm\": 0.3464163822525597,\n \"acc_norm_stderr\": 0.013905011180063246\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.41585341565425216,\n\
\ \"acc_stderr\": 0.004918612098944041,\n \"acc_norm\": 0.5674168492332204,\n\
\ \"acc_norm_stderr\": 0.00494421593702139\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3037037037037037,\n\
\ \"acc_stderr\": 0.039725528847851375,\n \"acc_norm\": 0.3037037037037037,\n\
\ \"acc_norm_stderr\": 0.039725528847851375\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.037385206761196665,\n\
\ \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.037385206761196665\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.02461829819586651,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.02461829819586651\n },\n\
\ \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.0261488180184245,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.0261488180184245\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.25161290322580643,\n \"acc_stderr\": 0.024685979286239956,\n \"\
acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.024685979286239956\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.270935960591133,\n \"acc_stderr\": 0.03127090713297698,\n \"acc_norm\"\
: 0.270935960591133,\n \"acc_norm_stderr\": 0.03127090713297698\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2474747474747475,\n \"acc_stderr\": 0.030746300742124488,\n \"\
acc_norm\": 0.2474747474747475,\n \"acc_norm_stderr\": 0.030746300742124488\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2849740932642487,\n \"acc_stderr\": 0.032577140777096614,\n\
\ \"acc_norm\": 0.2849740932642487,\n \"acc_norm_stderr\": 0.032577140777096614\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2282051282051282,\n \"acc_stderr\": 0.02127839386358628,\n \
\ \"acc_norm\": 0.2282051282051282,\n \"acc_norm_stderr\": 0.02127839386358628\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.25137614678899084,\n\
\ \"acc_stderr\": 0.01859920636028741,\n \"acc_norm\": 0.25137614678899084,\n\
\ \"acc_norm_stderr\": 0.01859920636028741\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.028353212866863448,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.028353212866863448\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350194,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350194\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658335,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.18834080717488788,\n\
\ \"acc_stderr\": 0.02624113299640727,\n \"acc_norm\": 0.18834080717488788,\n\
\ \"acc_norm_stderr\": 0.02624113299640727\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n\
\ \"acc_stderr\": 0.01576998484069053,\n \"acc_norm\": 0.26436781609195403,\n\
\ \"acc_norm_stderr\": 0.01576998484069053\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3215434083601286,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.3215434083601286,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27053455019556716,\n\
\ \"acc_stderr\": 0.011345996743539265,\n \"acc_norm\": 0.27053455019556716,\n\
\ \"acc_norm_stderr\": 0.011345996743539265\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16911764705882354,\n \"acc_stderr\": 0.022770868010113025,\n\
\ \"acc_norm\": 0.16911764705882354,\n \"acc_norm_stderr\": 0.022770868010113025\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27450980392156865,\n \"acc_stderr\": 0.018054027458815194,\n \
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.018054027458815194\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2571428571428571,\n \"acc_stderr\": 0.027979823538744546,\n\
\ \"acc_norm\": 0.2571428571428571,\n \"acc_norm_stderr\": 0.027979823538744546\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.24096385542168675,\n\
\ \"acc_stderr\": 0.03329394119073529,\n \"acc_norm\": 0.24096385542168675,\n\
\ \"acc_norm_stderr\": 0.03329394119073529\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21664626682986537,\n\
\ \"mc1_stderr\": 0.014421468452506985,\n \"mc2\": 0.38554214033294804,\n\
\ \"mc2_stderr\": 0.014744935679888965\n }\n}\n```"
repo_url: https://huggingface.co/Dampish/StellarX-4B-V0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-16-25.972049.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-16-25.972049.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-16-25.972049.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-16-25.972049.parquet'
- config_name: results
data_files:
- split: 2023_09_18T13_16_25.972049
path:
- results_2023-09-18T13-16-25.972049.parquet
- split: latest
path:
- results_2023-09-18T13-16-25.972049.parquet
---
# Dataset Card for Evaluation run of Dampish/StellarX-4B-V0.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Dampish/StellarX-4B-V0.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Dampish/StellarX-4B-V0.2](https://huggingface.co/Dampish/StellarX-4B-V0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Dampish__StellarX-4B-V0.2",
"harness_truthfulqa_mc_0",
split="train")
```
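Each run is also available under its timestamped split name, as listed in the configs above; a minimal sketch, assuming the single run recorded in this card:
```python
from datasets import load_dataset

# Load the single recorded run by its timestamped split name
# (equivalent to the "latest" split here, since this card was
# created from one run).
data = load_dataset(
    "open-llm-leaderboard/details_Dampish__StellarX-4B-V0.2",
    "harness_truthfulqa_mc_0",
    split="2023_09_18T13_16_25.972049",
)
```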
## Latest results
These are the [latest results from run 2023-09-18T13:16:25.972049](https://huggingface.co/datasets/open-llm-leaderboard/details_Dampish__StellarX-4B-V0.2/blob/main/results_2023-09-18T13-16-25.972049.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2592434520324127,
"acc_stderr": 0.03173429851074048,
"acc_norm": 0.2623040227525283,
"acc_norm_stderr": 0.03173986420447102,
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506985,
"mc2": 0.38554214033294804,
"mc2_stderr": 0.014744935679888965
},
"harness|arc:challenge|25": {
"acc": 0.3174061433447099,
"acc_stderr": 0.01360223908803817,
"acc_norm": 0.3464163822525597,
"acc_norm_stderr": 0.013905011180063246
},
"harness|hellaswag|10": {
"acc": 0.41585341565425216,
"acc_stderr": 0.004918612098944041,
"acc_norm": 0.5674168492332204,
"acc_norm_stderr": 0.00494421593702139
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.039725528847851375,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.039725528847851375
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.037385206761196665,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.037385206761196665
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2,
"acc_stderr": 0.02461829819586651,
"acc_norm": 0.2,
"acc_norm_stderr": 0.02461829819586651
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2,
"acc_stderr": 0.0261488180184245,
"acc_norm": 0.2,
"acc_norm_stderr": 0.0261488180184245
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.03127090713297698,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.03127090713297698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2474747474747475,
"acc_stderr": 0.030746300742124488,
"acc_norm": 0.2474747474747475,
"acc_norm_stderr": 0.030746300742124488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2849740932642487,
"acc_stderr": 0.032577140777096614,
"acc_norm": 0.2849740932642487,
"acc_norm_stderr": 0.032577140777096614
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2282051282051282,
"acc_stderr": 0.02127839386358628,
"acc_norm": 0.2282051282051282,
"acc_norm_stderr": 0.02127839386358628
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25137614678899084,
"acc_stderr": 0.01859920636028741,
"acc_norm": 0.25137614678899084,
"acc_norm_stderr": 0.01859920636028741
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.028353212866863448,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.028353212866863448
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350194,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350194
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658335,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.18834080717488788,
"acc_stderr": 0.02624113299640727,
"acc_norm": 0.18834080717488788,
"acc_norm_stderr": 0.02624113299640727
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.01576998484069053,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.01576998484069053
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3215434083601286,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.3215434083601286,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590638,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27053455019556716,
"acc_stderr": 0.011345996743539265,
"acc_norm": 0.27053455019556716,
"acc_norm_stderr": 0.011345996743539265
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16911764705882354,
"acc_stderr": 0.022770868010113025,
"acc_norm": 0.16911764705882354,
"acc_norm_stderr": 0.022770868010113025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.018054027458815194,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.018054027458815194
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2571428571428571,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.2571428571428571,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.24096385542168675,
"acc_stderr": 0.03329394119073529,
"acc_norm": 0.24096385542168675,
"acc_norm_stderr": 0.03329394119073529
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506985,
"mc2": 0.38554214033294804,
"mc2_stderr": 0.014744935679888965
}
}
```
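The aggregated numbers above are also stored under the "results" configuration defined in this card; a minimal sketch of loading them programmatically (the exact column layout of the parquet is an assumption and may differ):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics shown above;
# its "latest" split points at the most recent results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_Dampish__StellarX-4B-V0.2",
    "results",
    split="latest",
)
# Inspect the first (and typically only) row of aggregated metrics.
print(results[0])
```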
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DeepSpiral/x1134 | 2023-09-18T13:38:30.000Z | [
"region:us"
] | DeepSpiral | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_PY007__TinyLlama-1.1B-intermediate-step-240k-503b | 2023-09-18T13:33:01.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PY007/TinyLlama-1.1B-intermediate-step-240k-503b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PY007/TinyLlama-1.1B-intermediate-step-240k-503b](https://huggingface.co/PY007/TinyLlama-1.1B-intermediate-step-240k-503b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PY007__TinyLlama-1.1B-intermediate-step-240k-503b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T13:31:42.519724](https://huggingface.co/datasets/open-llm-leaderboard/details_PY007__TinyLlama-1.1B-intermediate-step-240k-503b/blob/main/results_2023-09-18T13-31-42.519724.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2646119045336586,\n\
\ \"acc_stderr\": 0.03201502691895088,\n \"acc_norm\": 0.26706290206612565,\n\
\ \"acc_norm_stderr\": 0.03202552691519412,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041834,\n \"mc2\": 0.40170337518856825,\n\
\ \"mc2_stderr\": 0.014421505165350026\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2593856655290102,\n \"acc_stderr\": 0.012808273573927106,\n\
\ \"acc_norm\": 0.29266211604095566,\n \"acc_norm_stderr\": 0.01329591610361942\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3857797251543517,\n\
\ \"acc_stderr\": 0.004857840934549167,\n \"acc_norm\": 0.4971121290579566,\n\
\ \"acc_norm_stderr\": 0.004989698183207835\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n\
\ \"acc_stderr\": 0.03406542058502652,\n \"acc_norm\": 0.1925925925925926,\n\
\ \"acc_norm_stderr\": 0.03406542058502652\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.032790004063100515,\n\
\ \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.032790004063100515\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.25660377358490566,\n \"acc_stderr\": 0.026880647889051975,\n\
\ \"acc_norm\": 0.25660377358490566,\n \"acc_norm_stderr\": 0.026880647889051975\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.02895734278834235,\n\
\ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.02895734278834235\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518753,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518753\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.03664666337225256,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.03664666337225256\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.023068188848261107,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.023068188848261107\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790607,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790607\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1935483870967742,\n\
\ \"acc_stderr\": 0.02247525852553606,\n \"acc_norm\": 0.1935483870967742,\n\
\ \"acc_norm_stderr\": 0.02247525852553606\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.19704433497536947,\n \"acc_stderr\": 0.02798672466673621,\n\
\ \"acc_norm\": 0.19704433497536947,\n \"acc_norm_stderr\": 0.02798672466673621\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.19393939393939394,\n \"acc_stderr\": 0.0308741451365621,\n\
\ \"acc_norm\": 0.19393939393939394,\n \"acc_norm_stderr\": 0.0308741451365621\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20707070707070707,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3316062176165803,\n \"acc_stderr\": 0.03397636541089116,\n\
\ \"acc_norm\": 0.3316062176165803,\n \"acc_norm_stderr\": 0.03397636541089116\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2794871794871795,\n \"acc_stderr\": 0.022752388839776826,\n\
\ \"acc_norm\": 0.2794871794871795,\n \"acc_norm_stderr\": 0.022752388839776826\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948482,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948482\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279493,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279493\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23486238532110093,\n \"acc_stderr\": 0.018175110510343585,\n \"\
acc_norm\": 0.23486238532110093,\n \"acc_norm_stderr\": 0.018175110510343585\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.03256850570293648,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.03256850570293648\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27450980392156865,\n \"acc_stderr\": 0.031321798030832904,\n \"\
acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.031321798030832904\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.26905829596412556,\n\
\ \"acc_stderr\": 0.02976377940687497,\n \"acc_norm\": 0.26905829596412556,\n\
\ \"acc_norm_stderr\": 0.02976377940687497\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.0413311944024384,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.0413311944024384\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.045821241601615506,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.045821241601615506\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.029745048572674047,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.029745048572674047\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n\
\ \"acc_stderr\": 0.01576998484069052,\n \"acc_norm\": 0.26436781609195403,\n\
\ \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2508038585209003,\n\
\ \"acc_stderr\": 0.024619771956697168,\n \"acc_norm\": 0.2508038585209003,\n\
\ \"acc_norm_stderr\": 0.024619771956697168\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843007,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843007\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25684485006518903,\n\
\ \"acc_stderr\": 0.011158455853098858,\n \"acc_norm\": 0.25684485006518903,\n\
\ \"acc_norm_stderr\": 0.011158455853098858\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2757352941176471,\n \"acc_stderr\": 0.027146271936625166,\n\
\ \"acc_norm\": 0.2757352941176471,\n \"acc_norm_stderr\": 0.027146271936625166\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2761437908496732,\n \"acc_stderr\": 0.018087276935663137,\n \
\ \"acc_norm\": 0.2761437908496732,\n \"acc_norm_stderr\": 0.018087276935663137\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.30612244897959184,\n \"acc_stderr\": 0.029504896454595975,\n\
\ \"acc_norm\": 0.30612244897959184,\n \"acc_norm_stderr\": 0.029504896454595975\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2935323383084577,\n\
\ \"acc_stderr\": 0.032200241045342054,\n \"acc_norm\": 0.2935323383084577,\n\
\ \"acc_norm_stderr\": 0.032200241045342054\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.034605799075530276,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.034605799075530276\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.035087719298245626,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.035087719298245626\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041834,\n \"mc2\": 0.40170337518856825,\n\
\ \"mc2_stderr\": 0.014421505165350026\n }\n}\n```"
repo_url: https://huggingface.co/PY007/TinyLlama-1.1B-intermediate-step-240k-503b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-31-42.519724.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-31-42.519724.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-31-42.519724.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-31-42.519724.parquet'
- config_name: results
data_files:
- split: 2023_09_18T13_31_42.519724
path:
- results_2023-09-18T13-31-42.519724.parquet
- split: latest
path:
- results_2023-09-18T13-31-42.519724.parquet
---
# Dataset Card for Evaluation run of PY007/TinyLlama-1.1B-intermediate-step-240k-503b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PY007/TinyLlama-1.1B-intermediate-step-240k-503b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PY007/TinyLlama-1.1B-intermediate-step-240k-503b](https://huggingface.co/PY007/TinyLlama-1.1B-intermediate-step-240k-503b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PY007__TinyLlama-1.1B-intermediate-step-240k-503b",
"harness_truthfulqa_mc_0",
	split="latest")
```
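The same call shape also retrieves the aggregated metrics: per the configuration list above, they live in the "results" config, whose "latest" split resolves to the most recent run (a minimal sketch, assuming the default `datasets` loading behaviour):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" resolves to the most recent
# evaluation run (here 2023-09-18T13:31:42.519724).
results = load_dataset(
    "open-llm-leaderboard/details_PY007__TinyLlama-1.1B-intermediate-step-240k-503b",
    "results",
    split="latest",
)
print(results[0])
```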
## Latest results
These are the [latest results from run 2023-09-18T13:31:42.519724](https://huggingface.co/datasets/open-llm-leaderboard/details_PY007__TinyLlama-1.1B-intermediate-step-240k-503b/blob/main/results_2023-09-18T13-31-42.519724.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2646119045336586,
"acc_stderr": 0.03201502691895088,
"acc_norm": 0.26706290206612565,
"acc_norm_stderr": 0.03202552691519412,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041834,
"mc2": 0.40170337518856825,
"mc2_stderr": 0.014421505165350026
},
"harness|arc:challenge|25": {
"acc": 0.2593856655290102,
"acc_stderr": 0.012808273573927106,
"acc_norm": 0.29266211604095566,
"acc_norm_stderr": 0.01329591610361942
},
"harness|hellaswag|10": {
"acc": 0.3857797251543517,
"acc_stderr": 0.004857840934549167,
"acc_norm": 0.4971121290579566,
"acc_norm_stderr": 0.004989698183207835
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.03406542058502652,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.03406542058502652
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.032790004063100515,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.032790004063100515
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.25660377358490566,
"acc_stderr": 0.026880647889051975,
"acc_norm": 0.25660377358490566,
"acc_norm_stderr": 0.026880647889051975
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2680851063829787,
"acc_stderr": 0.02895734278834235,
"acc_norm": 0.2680851063829787,
"acc_norm_stderr": 0.02895734278834235
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518753,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518753
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.03664666337225256,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.03664666337225256
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.023068188848261107,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.023068188848261107
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790607,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790607
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1935483870967742,
"acc_stderr": 0.02247525852553606,
"acc_norm": 0.1935483870967742,
"acc_norm_stderr": 0.02247525852553606
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.19704433497536947,
"acc_stderr": 0.02798672466673621,
"acc_norm": 0.19704433497536947,
"acc_norm_stderr": 0.02798672466673621
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.19393939393939394,
"acc_stderr": 0.0308741451365621,
"acc_norm": 0.19393939393939394,
"acc_norm_stderr": 0.0308741451365621
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3316062176165803,
"acc_stderr": 0.03397636541089116,
"acc_norm": 0.3316062176165803,
"acc_norm_stderr": 0.03397636541089116
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2794871794871795,
"acc_stderr": 0.022752388839776826,
"acc_norm": 0.2794871794871795,
"acc_norm_stderr": 0.022752388839776826
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948482,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279493,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279493
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23486238532110093,
"acc_stderr": 0.018175110510343585,
"acc_norm": 0.23486238532110093,
"acc_norm_stderr": 0.018175110510343585
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.03256850570293648,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.03256850570293648
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.26905829596412556,
"acc_stderr": 0.02976377940687497,
"acc_norm": 0.26905829596412556,
"acc_norm_stderr": 0.02976377940687497
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.0413311944024384,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.0413311944024384
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.029745048572674047,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.029745048572674047
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.01576998484069052,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.01576998484069052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2508038585209003,
"acc_stderr": 0.024619771956697168,
"acc_norm": 0.2508038585209003,
"acc_norm_stderr": 0.024619771956697168
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25684485006518903,
"acc_stderr": 0.011158455853098858,
"acc_norm": 0.25684485006518903,
"acc_norm_stderr": 0.011158455853098858
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2757352941176471,
"acc_stderr": 0.027146271936625166,
"acc_norm": 0.2757352941176471,
"acc_norm_stderr": 0.027146271936625166
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2761437908496732,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.2761437908496732,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.30612244897959184,
"acc_stderr": 0.029504896454595975,
"acc_norm": 0.30612244897959184,
"acc_norm_stderr": 0.029504896454595975
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2935323383084577,
"acc_stderr": 0.032200241045342054,
"acc_norm": 0.2935323383084577,
"acc_norm_stderr": 0.032200241045342054
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.034605799075530276,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.034605799075530276
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.035087719298245626,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.035087719298245626
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041834,
"mc2": 0.40170337518856825,
"mc2_stderr": 0.014421505165350026
}
}
```
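As a quick sanity check, the headline "all" numbers can be recomputed from the per-task entries above (a minimal sketch: the file name comes from the link above, and the top-level JSON layout, with the metrics under a "results" key, is an assumption about the harness output):
```python
import json

# Hypothetical local copy of the linked results file; the metrics are assumed
# to sit under a top-level "results" key, falling back to the root otherwise.
with open("results_2023-09-18T13-31-42.519724.json") as f:
    blob = json.load(f)

metrics = blob.get("results", blob)
accs = [v["acc"] for name, v in metrics.items() if name != "all" and "acc" in v]
print(f"mean acc over {len(accs)} tasks: {sum(accs) / len(accs):.4f}")
# Should land near metrics["all"]["acc"] (~0.2646 for this run), though the
# leaderboard's own aggregation may differ slightly.
```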
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ssahir/gtzan_all_preprocessed | 2023-09-18T13:40:52.000Z | [
"region:us"
] | ssahir | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': blues
'1': classical
'2': country
'3': disco
'4': hiphop
'5': jazz
'6': metal
'7': pop
'8': reggae
'9': rock
- name: input_values
sequence: float32
- name: attention_mask
sequence: int32
splits:
- name: train
num_bytes: 3452159816
num_examples: 899
- name: test
num_bytes: 384000696
num_examples: 100
download_size: 1923103923
dataset_size: 3836160512
---
# Dataset Card for "gtzan_all_preprocessed"
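The YAML header above declares the schema: a 10-class genre `label`, plus preprocessed `input_values` (float32) and `attention_mask` (int32) sequences, split 899/100 between train and test. A minimal loading sketch, assuming the repo is public and loads with its default config:
```python
from datasets import load_dataset

# Loads the train/test splits declared in the YAML header above.
ds = load_dataset("ssahir/gtzan_all_preprocessed")

sample = ds["train"][0]
print(sample["label"])                # integer genre id: 0=blues ... 9=rock
print(len(sample["input_values"]))    # preprocessed float32 audio sequence
print(len(sample["attention_mask"]))  # int32 mask aligned with input_values
```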
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Undi95__MLewd-L2-Chat-13B | 2023-09-18T13:39:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/MLewd-L2-Chat-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/MLewd-L2-Chat-13B](https://huggingface.co/Undi95/MLewd-L2-Chat-13B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__MLewd-L2-Chat-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T13:38:28.135797](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-L2-Chat-13B/blob/main/results_2023-09-18T13-38-28.135797.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5886983442536506,\n\
\ \"acc_stderr\": 0.03393248935524807,\n \"acc_norm\": 0.5923771951565636,\n\
\ \"acc_norm_stderr\": 0.033910992268385266,\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907926,\n \"mc2\": 0.5283543922925904,\n\
\ \"mc2_stderr\": 0.015514015586882196\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5998293515358362,\n \"acc_stderr\": 0.014317197787809174,\n\
\ \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6452897829117705,\n\
\ \"acc_stderr\": 0.004774476498238617,\n \"acc_norm\": 0.8418641704839673,\n\
\ \"acc_norm_stderr\": 0.003641226294167795\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237103,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237103\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342668,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342668\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6935483870967742,\n \"acc_stderr\": 0.026226485652553887,\n \"\
acc_norm\": 0.6935483870967742,\n \"acc_norm_stderr\": 0.026226485652553887\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\"\
: 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365907,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365907\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630644,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630644\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.02502861027671086,\n \
\ \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.02502861027671086\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131137,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131137\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7743119266055046,\n \"acc_stderr\": 0.017923087667803064,\n \"\
acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.017923087667803064\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.041184385658062976,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.041184385658062976\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260594,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260594\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\
\ \"acc_stderr\": 0.02490443909891824,\n \"acc_norm\": 0.8247863247863247,\n\
\ \"acc_norm_stderr\": 0.02490443909891824\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.014836205167333555,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.014836205167333555\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584197,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584197\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5027932960893855,\n\
\ \"acc_stderr\": 0.016722240595491714,\n \"acc_norm\": 0.5027932960893855,\n\
\ \"acc_norm_stderr\": 0.016722240595491714\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.027363593284684965,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.027363593284684965\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.02731684767419271,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.02731684767419271\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.026348564412011628,\n\
\ \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.026348564412011628\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43741851368970014,\n\
\ \"acc_stderr\": 0.012669813464935726,\n \"acc_norm\": 0.43741851368970014,\n\
\ \"acc_norm_stderr\": 0.012669813464935726\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5915032679738562,\n \"acc_stderr\": 0.019886221037501862,\n \
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.019886221037501862\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907926,\n \"mc2\": 0.5283543922925904,\n\
\ \"mc2_stderr\": 0.015514015586882196\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/MLewd-L2-Chat-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-38-28.135797.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-38-28.135797.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-38-28.135797.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-38-28.135797.parquet'
- config_name: results
data_files:
- split: 2023_09_18T13_38_28.135797
path:
- results_2023-09-18T13-38-28.135797.parquet
- split: latest
path:
- results_2023-09-18T13-38-28.135797.parquet
---
# Dataset Card for Evaluation run of Undi95/MLewd-L2-Chat-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/MLewd-L2-Chat-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/MLewd-L2-Chat-13B](https://huggingface.co/Undi95/MLewd-L2-Chat-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__MLewd-L2-Chat-13B",
"harness_truthfulqa_mc_0",
split="train")
```
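The same pattern works for any configuration listed in the YAML header; for instance, the aggregated "results" configuration can be loaded as follows (a minimal sketch, assuming the standard `datasets` API and the splits defined above):
```python
from datasets import load_dataset

# Load the aggregated metrics of the run (the "results" config defined
# in the YAML header); the "latest" split points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_Undi95__MLewd-L2-Chat-13B",
    "results",
    split="latest",
)
print(results[0])  # the first row holds the aggregated metrics for this run
```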
## Latest results
These are the [latest results from run 2023-09-18T13:38:28.135797](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-L2-Chat-13B/blob/main/results_2023-09-18T13-38-28.135797.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5886983442536506,
"acc_stderr": 0.03393248935524807,
"acc_norm": 0.5923771951565636,
"acc_norm_stderr": 0.033910992268385266,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907926,
"mc2": 0.5283543922925904,
"mc2_stderr": 0.015514015586882196
},
"harness|arc:challenge|25": {
"acc": 0.5998293515358362,
"acc_stderr": 0.014317197787809174,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.6452897829117705,
"acc_stderr": 0.004774476498238617,
"acc_norm": 0.8418641704839673,
"acc_norm_stderr": 0.003641226294167795
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237103,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237103
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843671,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843671
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.024552292209342668,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.024552292209342668
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6935483870967742,
"acc_stderr": 0.026226485652553887,
"acc_norm": 0.6935483870967742,
"acc_norm_stderr": 0.026226485652553887
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365907,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365907
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630644,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630644
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.02502861027671086,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.02502861027671086
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131137,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131137
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413926,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7743119266055046,
"acc_stderr": 0.017923087667803064,
"acc_norm": 0.7743119266055046,
"acc_norm_stderr": 0.017923087667803064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.041184385658062976,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.041184385658062976
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260594,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260594
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.02490443909891824,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.02490443909891824
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333555,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333555
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584197,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584197
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5027932960893855,
"acc_stderr": 0.016722240595491714,
"acc_norm": 0.5027932960893855,
"acc_norm_stderr": 0.016722240595491714
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.027363593284684965,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.027363593284684965
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.02731684767419271,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.02731684767419271
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6604938271604939,
"acc_stderr": 0.026348564412011628,
"acc_norm": 0.6604938271604939,
"acc_norm_stderr": 0.026348564412011628
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43741851368970014,
"acc_stderr": 0.012669813464935726,
"acc_norm": 0.43741851368970014,
"acc_norm_stderr": 0.012669813464935726
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5698529411764706,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.5698529411764706,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.019886221037501862,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.019886221037501862
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907926,
"mc2": 0.5283543922925904,
"mc2_stderr": 0.015514015586882196
}
}
```
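The raw results file linked above can also be fetched directly; here is a minimal sketch using `huggingface_hub` (the filename is taken from the link for this run, and the exact JSON nesting may vary between runs):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON referenced above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Undi95__MLewd-L2-Chat-13B",
    filename="results_2023-09-18T13-38-28.135797.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(list(results.keys()))  # top-level blocks, cf. the metrics shown above
```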
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Undi95__ReMM-v2.1-L2-13B | 2023-09-18T13:45:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/ReMM-v2.1-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/ReMM-v2.1-L2-13B](https://huggingface.co/Undi95/ReMM-v2.1-L2-13B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__ReMM-v2.1-L2-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T13:43:56.304128](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-v2.1-L2-13B/blob/main/results_2023-09-18T13-43-56.304128.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5614512577762221,\n\
\ \"acc_stderr\": 0.03441984146887643,\n \"acc_norm\": 0.5652166112214104,\n\
\ \"acc_norm_stderr\": 0.03439813719483044,\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.502982198851368,\n\
\ \"mc2_stderr\": 0.015602709737779776\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.01440136664121638,\n\
\ \"acc_norm\": 0.6143344709897611,\n \"acc_norm_stderr\": 0.01422425097325718\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6468830910177256,\n\
\ \"acc_stderr\": 0.004769618829196511,\n \"acc_norm\": 0.8391754630551683,\n\
\ \"acc_norm_stderr\": 0.0036661823284423437\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009794,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009794\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.03266204299064678,\n\
\ \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.03266204299064678\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.024373197867983063,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.024373197867983063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.027379871229943255,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.027379871229943255\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.029252823291803638,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.029252823291803638\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5205128205128206,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n\
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7174311926605504,\n \"acc_stderr\": 0.01930424349770715,\n \"\
acc_norm\": 0.7174311926605504,\n \"acc_norm_stderr\": 0.01930424349770715\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114968,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114968\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.04260735157644559,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.04260735157644559\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.026453508054040332,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.026453508054040332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n\
\ \"acc_stderr\": 0.01513338327898883,\n \"acc_norm\": 0.7662835249042146,\n\
\ \"acc_norm_stderr\": 0.01513338327898883\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.02595005433765407,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.02595005433765407\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45027932960893857,\n\
\ \"acc_stderr\": 0.016639615236845803,\n \"acc_norm\": 0.45027932960893857,\n\
\ \"acc_norm_stderr\": 0.016639615236845803\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424523,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424523\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934023,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934023\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.027002521034516478,\n\
\ \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.027002521034516478\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n\
\ \"acc_stderr\": 0.012635799922765848,\n \"acc_norm\": 0.4276401564537158,\n\
\ \"acc_norm_stderr\": 0.012635799922765848\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5833333333333334,\n \"acc_stderr\": 0.019944914136873586,\n \
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.019944914136873586\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.03106721126287247,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.03106721126287247\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.681592039800995,\n\
\ \"acc_stderr\": 0.03294118479054095,\n \"acc_norm\": 0.681592039800995,\n\
\ \"acc_norm_stderr\": 0.03294118479054095\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338733,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338733\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.502982198851368,\n\
\ \"mc2_stderr\": 0.015602709737779776\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/ReMM-v2.1-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-43-56.304128.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-43-56.304128.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-43-56.304128.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-43-56.304128.parquet'
- config_name: results
data_files:
- split: 2023_09_18T13_43_56.304128
path:
- results_2023-09-18T13-43-56.304128.parquet
- split: latest
path:
- results_2023-09-18T13-43-56.304128.parquet
---
# Dataset Card for Evaluation run of Undi95/ReMM-v2.1-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/ReMM-v2.1-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/ReMM-v2.1-L2-13B](https://huggingface.co/Undi95/ReMM-v2.1-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-v2.1-L2-13B",
"harness_truthfulqa_mc_0",
split="train")
```
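The YAML configs at the top of this card also define a "results" configuration and a "latest" split for every eval. A minimal sketch of loading the aggregated results at their latest state, assuming the config and split names exactly as listed in this card's header, is:
```python
from datasets import load_dataset

# "results" config and "latest" split names are taken from the YAML
# configs listed in this card's header (an assumption, not an API promise).
results = load_dataset(
    "open-llm-leaderboard/details_Undi95__ReMM-v2.1-L2-13B",
    "results",
    split="latest",
)
print(results)
```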
## Latest results
These are the [latest results from run 2023-09-18T13:43:56.304128](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-v2.1-L2-13B/blob/main/results_2023-09-18T13-43-56.304128.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5614512577762221,
"acc_stderr": 0.03441984146887643,
"acc_norm": 0.5652166112214104,
"acc_norm_stderr": 0.03439813719483044,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.502982198851368,
"mc2_stderr": 0.015602709737779776
},
"harness|arc:challenge|25": {
"acc": 0.5844709897610921,
"acc_stderr": 0.01440136664121638,
"acc_norm": 0.6143344709897611,
"acc_norm_stderr": 0.01422425097325718
},
"harness|hellaswag|10": {
"acc": 0.6468830910177256,
"acc_stderr": 0.004769618829196511,
"acc_norm": 0.8391754630551683,
"acc_norm_stderr": 0.0036661823284423437
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009794,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009794
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.024373197867983063,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.024373197867983063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.027379871229943255,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.027379871229943255
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819135,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819135
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.029252823291803638,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.029252823291803638
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.032016501007396114,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.032016501007396114
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7174311926605504,
"acc_stderr": 0.01930424349770715,
"acc_norm": 0.7174311926605504,
"acc_norm_stderr": 0.01930424349770715
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114968,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114968
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.04260735157644559,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.04260735157644559
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040332,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.01513338327898883,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.01513338327898883
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.02595005433765407,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.02595005433765407
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45027932960893857,
"acc_stderr": 0.016639615236845803,
"acc_norm": 0.45027932960893857,
"acc_norm_stderr": 0.016639615236845803
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424523,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424523
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934023,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934023
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.027002521034516478,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.027002521034516478
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.012635799922765848,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.012635799922765848
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.019944914136873586,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.019944914136873586
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.03106721126287247,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.03106721126287247
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.681592039800995,
"acc_stderr": 0.03294118479054095,
"acc_norm": 0.681592039800995,
"acc_norm_stderr": 0.03294118479054095
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.502982198851368,
"mc2_stderr": 0.015602709737779776
}
}
```
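As a quick illustration of working with these numbers, the dictionary printed above can be iterated directly. A minimal sketch, assuming it has been saved locally as `results.json` (a hypothetical filename), is:
```python
import json

# Hypothetical local copy of the dictionary printed above.
with open("results.json") as f:
    metrics = json.load(f)

# Headline aggregates from the "all" block.
print("overall acc:", metrics["all"]["acc"])
print("overall acc_norm:", metrics["all"]["acc_norm"])

# Per-task scores; the truthfulqa entry reports mc1/mc2 instead of acc.
for task, scores in metrics.items():
    if task == "all":
        continue
    value = scores.get("acc", scores.get("mc2"))
    print(f"{task}: {value:.4f}")
```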
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Undi95__UndiMix-v4-13B | 2023-09-18T13:47:17.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/UndiMix-v4-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/UndiMix-v4-13B](https://huggingface.co/Undi95/UndiMix-v4-13B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__UndiMix-v4-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T13:45:54.862257](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__UndiMix-v4-13B/blob/main/results_2023-09-18T13-45-54.862257.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5703924546400917,\n\
\ \"acc_stderr\": 0.03420615142613721,\n \"acc_norm\": 0.5744229523609673,\n\
\ \"acc_norm_stderr\": 0.03418308961008044,\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.01654241280949489,\n \"mc2\": 0.48955195668610224,\n\
\ \"mc2_stderr\": 0.015400278901450503\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.575938566552901,\n \"acc_stderr\": 0.014441889627464396,\n\
\ \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.014188277712349814\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6444931288587931,\n\
\ \"acc_stderr\": 0.0047768836327226165,\n \"acc_norm\": 0.8387771360286795,\n\
\ \"acc_norm_stderr\": 0.0036698484004877773\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.0295822451283843,\n\
\ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.0295822451283843\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364396,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364396\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.02437319786798306,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.02437319786798306\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.026860206444724345,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.026860206444724345\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.035025446508458714,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.035025446508458714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117478,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117478\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n\
\ \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.744954128440367,\n \"acc_stderr\": 0.01868850085653584,\n \"acc_norm\"\
: 0.744954128440367,\n \"acc_norm_stderr\": 0.01868850085653584\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814563,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814563\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.030500283176545857,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.030500283176545857\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.02581923325648372,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.02581923325648372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n\
\ \"acc_stderr\": 0.015075523238101074,\n \"acc_norm\": 0.768837803320562,\n\
\ \"acc_norm_stderr\": 0.015075523238101074\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.025950054337654075,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.025950054337654075\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n\
\ \"acc_stderr\": 0.016607021781050873,\n \"acc_norm\": 0.441340782122905,\n\
\ \"acc_norm_stderr\": 0.016607021781050873\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023344,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023344\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.02715520810320086,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.02715520810320086\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4315514993481095,\n\
\ \"acc_stderr\": 0.012650007999463878,\n \"acc_norm\": 0.4315514993481095,\n\
\ \"acc_norm_stderr\": 0.012650007999463878\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.030306257722468307,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.030306257722468307\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5882352941176471,\n \"acc_stderr\": 0.01991037746310594,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.01991037746310594\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.04653429807913507,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.04653429807913507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919798,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919798\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.03301405946987249,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.03301405946987249\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.01654241280949489,\n \"mc2\": 0.48955195668610224,\n\
\ \"mc2_stderr\": 0.015400278901450503\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/UndiMix-v4-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-45-54.862257.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-45-54.862257.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-45-54.862257.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-45-54.862257.parquet'
- config_name: results
data_files:
- split: 2023_09_18T13_45_54.862257
path:
- results_2023-09-18T13-45-54.862257.parquet
- split: latest
path:
- results_2023-09-18T13-45-54.862257.parquet
---
# Dataset Card for Evaluation run of Undi95/UndiMix-v4-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/UndiMix-v4-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/UndiMix-v4-13B](https://huggingface.co/Undi95/UndiMix-v4-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__UndiMix-v4-13B",
"harness_truthfulqa_mc_0",
split="train")
```
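
Because every per-task configuration follows the same naming pattern, you can also enumerate the available configurations programmatically and pull the most recent details for a single task. A minimal sketch using the `datasets` library (the config name below is one of the 61 listed in this card's YAML header; anything else is illustrative):
```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_Undi95__UndiMix-v4-13B"

# Enumerate the per-task configurations declared in the YAML header above
configs = get_dataset_config_names(REPO)
print(len(configs), "configurations, e.g.", configs[:3])

# Load the most recent evaluation details for one MMLU task
details = load_dataset(REPO, "harness_hendrycksTest_abstract_algebra_5", split="latest")
print(details)
```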
## Latest results
These are the [latest results from run 2023-09-18T13:45:54.862257](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__UndiMix-v4-13B/blob/main/results_2023-09-18T13-45-54.862257.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5703924546400917,
"acc_stderr": 0.03420615142613721,
"acc_norm": 0.5744229523609673,
"acc_norm_stderr": 0.03418308961008044,
"mc1": 0.33659730722154224,
"mc1_stderr": 0.01654241280949489,
"mc2": 0.48955195668610224,
"mc2_stderr": 0.015400278901450503
},
"harness|arc:challenge|25": {
"acc": 0.575938566552901,
"acc_stderr": 0.014441889627464396,
"acc_norm": 0.6194539249146758,
"acc_norm_stderr": 0.014188277712349814
},
"harness|hellaswag|10": {
"acc": 0.6444931288587931,
"acc_stderr": 0.0047768836327226165,
"acc_norm": 0.8387771360286795,
"acc_norm_stderr": 0.0036698484004877773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.0295822451283843,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.0295822451283843
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364396,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364396
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.02437319786798306,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.02437319786798306
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724345,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724345
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.035025446508458714,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.035025446508458714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.036462049632538115,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.036462049632538115
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117478,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117478
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413926,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.744954128440367,
"acc_stderr": 0.01868850085653584,
"acc_norm": 0.744954128440367,
"acc_norm_stderr": 0.01868850085653584
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545857,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545857
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.02581923325648372,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.02581923325648372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.768837803320562,
"acc_stderr": 0.015075523238101074,
"acc_norm": 0.768837803320562,
"acc_norm_stderr": 0.015075523238101074
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.025950054337654075,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.025950054337654075
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050873,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.027780141207023344,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.027780141207023344
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.02715520810320086,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.02715520810320086
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4315514993481095,
"acc_stderr": 0.012650007999463878,
"acc_norm": 0.4315514993481095,
"acc_norm_stderr": 0.012650007999463878
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.030306257722468307,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.030306257722468307
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.01991037746310594,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.01991037746310594
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913507,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919798,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919798
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33659730722154224,
"mc1_stderr": 0.01654241280949489,
"mc2": 0.48955195668610224,
"mc2_stderr": 0.015400278901450503
}
}
```
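Because every per-task entry above shares the same metric keys, a quick sanity check is to recompute the MMLU average from this dictionary. A minimal sketch, assuming the JSON shown above has been saved locally as `results.json` (the filename and loading path are illustrative):
```python
import json

# Load the results dictionary shown above (saved locally; filename is illustrative)
with open("results.json") as f:
    results = json.load(f)

# Mean accuracy over the "hendrycksTest" (MMLU) subtasks
mmlu_acc = [m["acc"] for task, m in results.items() if task.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_acc)} MMLU subtasks, mean acc = {sum(mmlu_acc) / len(mmlu_acc):.4f}")
```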
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Undi95__OpenRP-13B | 2023-09-18T13:50:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/OpenRP-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/OpenRP-13B](https://huggingface.co/Undi95/OpenRP-13B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__OpenRP-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T13:48:59.614981](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__OpenRP-13B/blob/main/results_2023-09-18T13-48-59.614981.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5760926841629516,\n\
\ \"acc_stderr\": 0.03424711249500131,\n \"acc_norm\": 0.5800492228294355,\n\
\ \"acc_norm_stderr\": 0.03422585829718664,\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.016557167322516875,\n \"mc2\": 0.48286539138692774,\n\
\ \"mc2_stderr\": 0.015189076635393605\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.0143839153022254,\n\
\ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000322\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6258713403704441,\n\
\ \"acc_stderr\": 0.004829081532826502,\n \"acc_norm\": 0.8260306711810397,\n\
\ \"acc_norm_stderr\": 0.0037830836739860575\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296563,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296563\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467381,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467381\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425082,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425082\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6838709677419355,\n\
\ \"acc_stderr\": 0.026450874489042774,\n \"acc_norm\": 0.6838709677419355,\n\
\ \"acc_norm_stderr\": 0.026450874489042774\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391244,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391244\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5205128205128206,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.03169380235712996,\n\
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.03169380235712996\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7651376146788991,\n \"acc_stderr\": 0.01817511051034356,\n \"\
acc_norm\": 0.7651376146788991,\n \"acc_norm_stderr\": 0.01817511051034356\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.042258754519696365,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.042258754519696365\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291518,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291518\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260594,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260594\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n\
\ \"acc_stderr\": 0.02704685763071669,\n \"acc_norm\": 0.782051282051282,\n\
\ \"acc_norm_stderr\": 0.02704685763071669\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7701149425287356,\n\
\ \"acc_stderr\": 0.015046301846691805,\n \"acc_norm\": 0.7701149425287356,\n\
\ \"acc_norm_stderr\": 0.015046301846691805\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895803,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895803\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47262569832402235,\n\
\ \"acc_stderr\": 0.016697420650642752,\n \"acc_norm\": 0.47262569832402235,\n\
\ \"acc_norm_stderr\": 0.016697420650642752\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.027530078447110303,\n\
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.027530078447110303\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42894393741851367,\n\
\ \"acc_stderr\": 0.012640625443067358,\n \"acc_norm\": 0.42894393741851367,\n\
\ \"acc_norm_stderr\": 0.012640625443067358\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.030254372573976722,\n\
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.030254372573976722\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.576797385620915,\n \"acc_stderr\": 0.019987809769482064,\n \
\ \"acc_norm\": 0.576797385620915,\n \"acc_norm_stderr\": 0.019987809769482064\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.016557167322516875,\n \"mc2\": 0.48286539138692774,\n\
\ \"mc2_stderr\": 0.015189076635393605\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/OpenRP-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-48-59.614981.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-48-59.614981.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-48-59.614981.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-48-59.614981.parquet'
- config_name: results
data_files:
- split: 2023_09_18T13_48_59.614981
path:
- results_2023-09-18T13-48-59.614981.parquet
- split: latest
path:
- results_2023-09-18T13-48-59.614981.parquet
---
# Dataset Card for Evaluation run of Undi95/OpenRP-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/OpenRP-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/OpenRP-13B](https://huggingface.co/Undi95/OpenRP-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__OpenRP-13B",
"harness_truthfulqa_mc_0",
split="train")
```
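The same pattern works for any configuration listed in this card's YAML header. A minimal sketch, assuming only the config and split names declared above (the `latest` split always resolves to the newest timestamped run):

```python
from datasets import load_dataset

# Load one MMLU subtask configuration; "latest" points to the
# 2023-09-18T13-48-59.614981 run's parquet file.
subtask = load_dataset(
    "open-llm-leaderboard/details_Undi95__OpenRP-13B",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(subtask)  # per-question details for that subtask
```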
## Latest results
These are the [latest results from run 2023-09-18T13:48:59.614981](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__OpenRP-13B/blob/main/results_2023-09-18T13-48-59.614981.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5760926841629516,
"acc_stderr": 0.03424711249500131,
"acc_norm": 0.5800492228294355,
"acc_norm_stderr": 0.03422585829718664,
"mc1": 0.3378212974296206,
"mc1_stderr": 0.016557167322516875,
"mc2": 0.48286539138692774,
"mc2_stderr": 0.015189076635393605
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.0143839153022254,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000322
},
"harness|hellaswag|10": {
"acc": 0.6258713403704441,
"acc_stderr": 0.004829081532826502,
"acc_norm": 0.8260306711810397,
"acc_norm_stderr": 0.0037830836739860575
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296563,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296563
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.56,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425082,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425082
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.042857142857142816,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.042857142857142816
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6838709677419355,
"acc_stderr": 0.026450874489042774,
"acc_norm": 0.6838709677419355,
"acc_norm_stderr": 0.026450874489042774
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391244,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391244
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.03169380235712996,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.03169380235712996
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7651376146788991,
"acc_stderr": 0.01817511051034356,
"acc_norm": 0.7651376146788991,
"acc_norm_stderr": 0.01817511051034356
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.042258754519696365,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.042258754519696365
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291518,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291518
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260594,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260594
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.02704685763071669,
"acc_norm": 0.782051282051282,
"acc_norm_stderr": 0.02704685763071669
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7701149425287356,
"acc_stderr": 0.015046301846691805,
"acc_norm": 0.7701149425287356,
"acc_norm_stderr": 0.015046301846691805
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895803,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47262569832402235,
"acc_stderr": 0.016697420650642752,
"acc_norm": 0.47262569832402235,
"acc_norm_stderr": 0.016697420650642752
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.027530078447110303,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.027530078447110303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42894393741851367,
"acc_stderr": 0.012640625443067358,
"acc_norm": 0.42894393741851367,
"acc_norm_stderr": 0.012640625443067358
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.030254372573976722,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.030254372573976722
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.576797385620915,
"acc_stderr": 0.019987809769482064,
"acc_norm": 0.576797385620915,
"acc_norm_stderr": 0.019987809769482064
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3378212974296206,
"mc1_stderr": 0.016557167322516875,
"mc2": 0.48286539138692774,
"mc2_stderr": 0.015189076635393605
}
}
```
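Rather than copying values out of the JSON above, the same aggregated metrics can be loaded through the `results` configuration declared in this card's YAML header. A minimal sketch (the column layout of the results parquet is not documented on this card, so the code only loads and prints the split for inspection):

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of each run;
# its "latest" split points to results_2023-09-18T13-48-59.614981.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_Undi95__OpenRP-13B",
    "results",
    split="latest",
)
print(results)
```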
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_nicholasKluge__Aira-2-774M | 2023-09-18T13:50:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of nicholasKluge/Aira-2-774M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nicholasKluge/Aira-2-774M](https://huggingface.co/nicholasKluge/Aira-2-774M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-2-774M\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T13:49:35.718586](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-2-774M/blob/main/results_2023-09-18T13-49-35.718586.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25274737880984716,\n\
\ \"acc_stderr\": 0.03151696657060322,\n \"acc_norm\": 0.2543231605957627,\n\
\ \"acc_norm_stderr\": 0.031526516312477076,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931588,\n \"mc2\": 0.41328321305397103,\n\
\ \"mc2_stderr\": 0.015490034789933101\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.26109215017064846,\n \"acc_stderr\": 0.012835523909473848,\n\
\ \"acc_norm\": 0.28754266211604096,\n \"acc_norm_stderr\": 0.01322671905626613\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3414658434574786,\n\
\ \"acc_stderr\": 0.00473232217215375,\n \"acc_norm\": 0.40798645688109936,\n\
\ \"acc_norm_stderr\": 0.0049045617959189965\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n\
\ \"acc_stderr\": 0.03999262876617722,\n \"acc_norm\": 0.3111111111111111,\n\
\ \"acc_norm_stderr\": 0.03999262876617722\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.0301675334686327,\n\
\ \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.0301675334686327\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695248,\n\
\ \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695248\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036844,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.34104046242774566,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.34104046242774566,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.030135906478517563,\n\
\ \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.030135906478517563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n\
\ \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.20175438596491227,\n\
\ \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.1931216931216931,\n \"acc_stderr\": 0.020330538160035636,\n \"\
acc_norm\": 0.1931216931216931,\n \"acc_norm_stderr\": 0.020330538160035636\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276862,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276862\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24516129032258063,\n\
\ \"acc_stderr\": 0.024472243840895528,\n \"acc_norm\": 0.24516129032258063,\n\
\ \"acc_norm_stderr\": 0.024472243840895528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.1625615763546798,\n \"acc_stderr\": 0.025960300064605597,\n\
\ \"acc_norm\": 0.1625615763546798,\n \"acc_norm_stderr\": 0.025960300064605597\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.31313131313131315,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.31313131313131315,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.03182155050916646,\n\
\ \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.03182155050916646\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.021362027725222728,\n\
\ \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.021362027725222728\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671548,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26055045871559634,\n \"acc_stderr\": 0.018819182034850068,\n \"\
acc_norm\": 0.26055045871559634,\n \"acc_norm_stderr\": 0.018819182034850068\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.19907407407407407,\n \"acc_stderr\": 0.027232298462690225,\n \"\
acc_norm\": 0.19907407407407407,\n \"acc_norm_stderr\": 0.027232298462690225\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.03114557065948678,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.03114557065948678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2911392405063291,\n \"acc_stderr\": 0.029571601065753374,\n \
\ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.029571601065753374\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.12556053811659193,\n\
\ \"acc_stderr\": 0.022238985469323756,\n \"acc_norm\": 0.12556053811659193,\n\
\ \"acc_norm_stderr\": 0.022238985469323756\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596919,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596919\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n\
\ \"acc_stderr\": 0.02948036054954119,\n \"acc_norm\": 0.28205128205128205,\n\
\ \"acc_norm_stderr\": 0.02948036054954119\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.19923371647509577,\n\
\ \"acc_stderr\": 0.014283378044296413,\n \"acc_norm\": 0.19923371647509577,\n\
\ \"acc_norm_stderr\": 0.014283378044296413\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.014219570788103982,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.014219570788103982\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3279742765273312,\n\
\ \"acc_stderr\": 0.0266644108869376,\n \"acc_norm\": 0.3279742765273312,\n\
\ \"acc_norm_stderr\": 0.0266644108869376\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.02324620264781975,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.02324620264781975\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.02612957252718085,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.02612957252718085\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16544117647058823,\n \"acc_stderr\": 0.022571771025494767,\n\
\ \"acc_norm\": 0.16544117647058823,\n \"acc_norm_stderr\": 0.022571771025494767\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24019607843137256,\n \"acc_stderr\": 0.01728276069516743,\n \
\ \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.01728276069516743\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n\
\ \"acc_stderr\": 0.03764425585984924,\n \"acc_norm\": 0.19090909090909092,\n\
\ \"acc_norm_stderr\": 0.03764425585984924\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22040816326530613,\n \"acc_stderr\": 0.026537045312145287,\n\
\ \"acc_norm\": 0.22040816326530613,\n \"acc_norm_stderr\": 0.026537045312145287\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n\
\ \"acc_stderr\": 0.03410646614071857,\n \"acc_norm\": 0.25903614457831325,\n\
\ \"acc_norm_stderr\": 0.03410646614071857\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931588,\n \"mc2\": 0.41328321305397103,\n\
\ \"mc2_stderr\": 0.015490034789933101\n }\n}\n```"
repo_url: https://huggingface.co/nicholasKluge/Aira-2-774M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-49-35.718586.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-49-35.718586.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-49-35.718586.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-49-35.718586.parquet'
- config_name: results
data_files:
- split: 2023_09_18T13_49_35.718586
path:
- results_2023-09-18T13-49-35.718586.parquet
- split: latest
path:
- results_2023-09-18T13-49-35.718586.parquet
---
# Dataset Card for Evaluation run of nicholasKluge/Aira-2-774M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-2-774M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-2-774M](https://huggingface.co/nicholasKluge/Aira-2-774M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-2-774M",
"harness_truthfulqa_mc_0",
split="train")
```
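Each per-task configuration also exposes a `latest` split alongside timestamp-named splits, so a specific evaluation run can be pinned instead of tracking whatever ran last. A minimal sketch (the config and split names are copied verbatim from the YAML above):
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_nicholasKluge__Aira-2-774M"

# "latest" always tracks the most recent evaluation run for this task.
latest = load_dataset(REPO, "harness_hendrycksTest_abstract_algebra_5", split="latest")

# A timestamped split pins the run evaluated on 2023-09-18 explicitly.
pinned = load_dataset(
    REPO,
    "harness_hendrycksTest_abstract_algebra_5",
    split="2023_09_18T13_49_35.718586",
)

print(latest)  # per-sample prompts, model predictions, and metrics for the task
```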
## Latest results
These are the [latest results from run 2023-09-18T13:49:35.718586](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-2-774M/blob/main/results_2023-09-18T13-49-35.718586.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25274737880984716,
"acc_stderr": 0.03151696657060322,
"acc_norm": 0.2543231605957627,
"acc_norm_stderr": 0.031526516312477076,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931588,
"mc2": 0.41328321305397103,
"mc2_stderr": 0.015490034789933101
},
"harness|arc:challenge|25": {
"acc": 0.26109215017064846,
"acc_stderr": 0.012835523909473848,
"acc_norm": 0.28754266211604096,
"acc_norm_stderr": 0.01322671905626613
},
"harness|hellaswag|10": {
"acc": 0.3414658434574786,
"acc_stderr": 0.00473232217215375,
"acc_norm": 0.40798645688109936,
"acc_norm_stderr": 0.0049045617959189965
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.03999262876617722,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.03999262876617722
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.0301675334686327,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.0301675334686327
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2943396226415094,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.2943396226415094,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.34104046242774566,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.34104046242774566,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.20175438596491227,
"acc_stderr": 0.037752050135836386,
"acc_norm": 0.20175438596491227,
"acc_norm_stderr": 0.037752050135836386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.1931216931216931,
"acc_stderr": 0.020330538160035636,
"acc_norm": 0.1931216931216931,
"acc_norm_stderr": 0.020330538160035636
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276862,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276862
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.024472243840895528,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.024472243840895528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1625615763546798,
"acc_stderr": 0.025960300064605597,
"acc_norm": 0.1625615763546798,
"acc_norm_stderr": 0.025960300064605597
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.31313131313131315,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.31313131313131315,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.26424870466321243,
"acc_stderr": 0.03182155050916646,
"acc_norm": 0.26424870466321243,
"acc_norm_stderr": 0.03182155050916646
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.021362027725222728,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.021362027725222728
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02665353159671548,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02665353159671548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26055045871559634,
"acc_stderr": 0.018819182034850068,
"acc_norm": 0.26055045871559634,
"acc_norm_stderr": 0.018819182034850068
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19907407407407407,
"acc_stderr": 0.027232298462690225,
"acc_norm": 0.19907407407407407,
"acc_norm_stderr": 0.027232298462690225
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.03114557065948678,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.03114557065948678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.12556053811659193,
"acc_stderr": 0.022238985469323756,
"acc_norm": 0.12556053811659193,
"acc_norm_stderr": 0.022238985469323756
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596919,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596919
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.02948036054954119,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.02948036054954119
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.19923371647509577,
"acc_stderr": 0.014283378044296413,
"acc_norm": 0.19923371647509577,
"acc_norm_stderr": 0.014283378044296413
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.014219570788103982,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.014219570788103982
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3279742765273312,
"acc_stderr": 0.0266644108869376,
"acc_norm": 0.3279742765273312,
"acc_norm_stderr": 0.0266644108869376
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.02612957252718085,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.02612957252718085
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16544117647058823,
"acc_stderr": 0.022571771025494767,
"acc_norm": 0.16544117647058823,
"acc_norm_stderr": 0.022571771025494767
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.01728276069516743,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.01728276069516743
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984924,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984924
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22040816326530613,
"acc_stderr": 0.026537045312145287,
"acc_norm": 0.22040816326530613,
"acc_norm_stderr": 0.026537045312145287
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.03410646614071857,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.03410646614071857
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931588,
"mc2": 0.41328321305397103,
"mc2_stderr": 0.015490034789933101
}
}
```
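To recompute the headline numbers from this snapshot programmatically, the per-task scores can be averaged directly. A minimal sketch (it assumes the JSON linked above nests the per-task scores under a top-level `"results"` key, as lm-eval-harness outputs typically do):
```python
import json
import urllib.request

# Fetch the raw results snapshot linked above ("resolve/" instead of "blob/" serves raw JSON).
URL = ("https://huggingface.co/datasets/open-llm-leaderboard/"
       "details_nicholasKluge__Aira-2-774M/resolve/main/"
       "results_2023-09-18T13-49-35.718586.json")
with urllib.request.urlopen(URL) as resp:
    snapshot = json.load(resp)

# Assumption: per-task scores live under a top-level "results" key.
scores = snapshot.get("results", snapshot)

# Macro-average accuracy over the 57 hendrycksTest (MMLU) subtasks.
mmlu = [v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, macro-averaged acc = {sum(mmlu) / len(mmlu):.4f}")
```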
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v37_SFT-R1-DPO-R2-7B | 2023-09-18T13:53:35.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v37_SFT-R1-DPO-R2-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PeanutJar/LLaMa-2-PeanutButter_v37_SFT-R1-DPO-R2-7B](https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v37_SFT-R1-DPO-R2-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v37_SFT-R1-DPO-R2-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T13:52:12.512549](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v37_SFT-R1-DPO-R2-7B/blob/main/results_2023-09-18T13-52-12.512549.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4757356941246475,\n\
\ \"acc_stderr\": 0.03534759162401654,\n \"acc_norm\": 0.47972138667961756,\n\
\ \"acc_norm_stderr\": 0.03533243222459744,\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627904,\n \"mc2\": 0.41999112300299424,\n\
\ \"mc2_stderr\": 0.014077295047564501\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5042662116040956,\n \"acc_stderr\": 0.014610858923956955,\n\
\ \"acc_norm\": 0.5409556313993175,\n \"acc_norm_stderr\": 0.014562291073601233\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5925114519020116,\n\
\ \"acc_stderr\": 0.004903628887264536,\n \"acc_norm\": 0.7909778928500298,\n\
\ \"acc_norm_stderr\": 0.004057792171893564\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04017901275981748,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04017901275981748\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4716981132075472,\n \"acc_stderr\": 0.0307235352490061,\n\
\ \"acc_norm\": 0.4716981132075472,\n \"acc_norm_stderr\": 0.0307235352490061\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.4393063583815029,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194974,\n \"\
acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194974\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.49032258064516127,\n\
\ \"acc_stderr\": 0.028438677998909558,\n \"acc_norm\": 0.49032258064516127,\n\
\ \"acc_norm_stderr\": 0.028438677998909558\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561953,\n\
\ \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561953\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398393,\n\
\ \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398393\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.51010101010101,\n \"acc_stderr\": 0.035616254886737454,\n \"acc_norm\"\
: 0.51010101010101,\n \"acc_norm_stderr\": 0.035616254886737454\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6735751295336787,\n \"acc_stderr\": 0.033840286211432945,\n\
\ \"acc_norm\": 0.6735751295336787,\n \"acc_norm_stderr\": 0.033840286211432945\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4512820512820513,\n \"acc_stderr\": 0.025230381238934833,\n\
\ \"acc_norm\": 0.4512820512820513,\n \"acc_norm_stderr\": 0.025230381238934833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871923,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871923\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.636697247706422,\n \"acc_stderr\": 0.020620603919625804,\n \"\
acc_norm\": 0.636697247706422,\n \"acc_norm_stderr\": 0.020620603919625804\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257013,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257013\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.553921568627451,\n \"acc_stderr\": 0.03488845451304974,\n \"acc_norm\"\
: 0.553921568627451,\n \"acc_norm_stderr\": 0.03488845451304974\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.6497890295358649,\n \"acc_stderr\": 0.031052391937584346,\n \"\
acc_norm\": 0.6497890295358649,\n \"acc_norm_stderr\": 0.031052391937584346\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n\
\ \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n\
\ \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.039158572914369714,\n\
\ \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.039158572914369714\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.048979577377811674,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.048979577377811674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6965811965811965,\n\
\ \"acc_stderr\": 0.030118210106942638,\n \"acc_norm\": 0.6965811965811965,\n\
\ \"acc_norm_stderr\": 0.030118210106942638\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6398467432950191,\n\
\ \"acc_stderr\": 0.017166362471369302,\n \"acc_norm\": 0.6398467432950191,\n\
\ \"acc_norm_stderr\": 0.017166362471369302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.02690784985628254,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.02690784985628254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5032679738562091,\n \"acc_stderr\": 0.028629305194003543,\n\
\ \"acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.028629305194003543\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.027604689028581996,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.027604689028581996\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4783950617283951,\n \"acc_stderr\": 0.02779476010500874,\n\
\ \"acc_norm\": 0.4783950617283951,\n \"acc_norm_stderr\": 0.02779476010500874\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.02866382014719949,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.02866382014719949\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36114732724902215,\n\
\ \"acc_stderr\": 0.01226793547751903,\n \"acc_norm\": 0.36114732724902215,\n\
\ \"acc_norm_stderr\": 0.01226793547751903\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.45098039215686275,\n \"acc_stderr\": 0.020130388312904528,\n \
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.020130388312904528\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5142857142857142,\n \"acc_stderr\": 0.03199615232806287,\n\
\ \"acc_norm\": 0.5142857142857142,\n \"acc_norm_stderr\": 0.03199615232806287\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n\
\ \"acc_stderr\": 0.03390393042268814,\n \"acc_norm\": 0.6417910447761194,\n\
\ \"acc_norm_stderr\": 0.03390393042268814\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627904,\n \"mc2\": 0.41999112300299424,\n\
\ \"mc2_stderr\": 0.014077295047564501\n }\n}\n```"
repo_url: https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v37_SFT-R1-DPO-R2-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-52-12.512549.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-52-12.512549.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-52-12.512549.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-52-12.512549.parquet'
- config_name: results
data_files:
- split: 2023_09_18T13_52_12.512549
path:
- results_2023-09-18T13-52-12.512549.parquet
- split: latest
path:
- results_2023-09-18T13-52-12.512549.parquet
---
# Dataset Card for Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v37_SFT-R1-DPO-R2-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v37_SFT-R1-DPO-R2-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PeanutJar/LLaMa-2-PeanutButter_v37_SFT-R1-DPO-R2-7B](https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v37_SFT-R1-DPO-R2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v37_SFT-R1-DPO-R2-7B",
"harness_truthfulqa_mc_0",
split="train")
```
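The same pattern applies to any configuration listed in the YAML header above. As a minimal sketch, the aggregated metrics of the run can be loaded through the "results" configuration, here using the "latest" split instead of a timestamped one:
```python
from datasets import load_dataset

# "results" and the "latest" split both appear in the configs section above;
# "latest" always resolves to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v37_SFT-R1-DPO-R2-7B",
    "results",
    split="latest",
)
print(results)  # inspect the schema before indexing into specific columns
```
Any of the per-task configurations listed above (e.g. `harness_hendrycksTest_abstract_algebra_5`) can be substituted for `"results"` in the same call.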
## Latest results
These are the [latest results from run 2023-09-18T13:52:12.512549](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v37_SFT-R1-DPO-R2-7B/blob/main/results_2023-09-18T13-52-12.512549.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4757356941246475,
"acc_stderr": 0.03534759162401654,
"acc_norm": 0.47972138667961756,
"acc_norm_stderr": 0.03533243222459744,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627904,
"mc2": 0.41999112300299424,
"mc2_stderr": 0.014077295047564501
},
"harness|arc:challenge|25": {
"acc": 0.5042662116040956,
"acc_stderr": 0.014610858923956955,
"acc_norm": 0.5409556313993175,
"acc_norm_stderr": 0.014562291073601233
},
"harness|hellaswag|10": {
"acc": 0.5925114519020116,
"acc_stderr": 0.004903628887264536,
"acc_norm": 0.7909778928500298,
"acc_norm_stderr": 0.004057792171893564
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04017901275981748,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04017901275981748
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4716981132075472,
"acc_stderr": 0.0307235352490061,
"acc_norm": 0.4716981132075472,
"acc_norm_stderr": 0.0307235352490061
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194974,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194974
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.49032258064516127,
"acc_stderr": 0.028438677998909558,
"acc_norm": 0.49032258064516127,
"acc_norm_stderr": 0.028438677998909558
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561953,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561953
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.03851716319398393,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.03851716319398393
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.51010101010101,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.51010101010101,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6735751295336787,
"acc_stderr": 0.033840286211432945,
"acc_norm": 0.6735751295336787,
"acc_norm_stderr": 0.033840286211432945
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4512820512820513,
"acc_stderr": 0.025230381238934833,
"acc_norm": 0.4512820512820513,
"acc_norm_stderr": 0.025230381238934833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871923,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871923
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.636697247706422,
"acc_stderr": 0.020620603919625804,
"acc_norm": 0.636697247706422,
"acc_norm_stderr": 0.020620603919625804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.03488845451304974,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.03488845451304974
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6497890295358649,
"acc_stderr": 0.031052391937584346,
"acc_norm": 0.6497890295358649,
"acc_norm_stderr": 0.031052391937584346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5560538116591929,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.5560538116591929,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5398773006134969,
"acc_stderr": 0.039158572914369714,
"acc_norm": 0.5398773006134969,
"acc_norm_stderr": 0.039158572914369714
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.048979577377811674,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.048979577377811674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6965811965811965,
"acc_stderr": 0.030118210106942638,
"acc_norm": 0.6965811965811965,
"acc_norm_stderr": 0.030118210106942638
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6398467432950191,
"acc_stderr": 0.017166362471369302,
"acc_norm": 0.6398467432950191,
"acc_norm_stderr": 0.017166362471369302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.02690784985628254,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.02690784985628254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.028629305194003543,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.028629305194003543
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581996,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581996
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4783950617283951,
"acc_stderr": 0.02779476010500874,
"acc_norm": 0.4783950617283951,
"acc_norm_stderr": 0.02779476010500874
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.02866382014719949,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.02866382014719949
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36114732724902215,
"acc_stderr": 0.01226793547751903,
"acc_norm": 0.36114732724902215,
"acc_norm_stderr": 0.01226793547751903
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.020130388312904528,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.020130388312904528
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5142857142857142,
"acc_stderr": 0.03199615232806287,
"acc_norm": 0.5142857142857142,
"acc_norm_stderr": 0.03199615232806287
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268814,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268814
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627904,
"mc2": 0.41999112300299424,
"mc2_stderr": 0.014077295047564501
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Undi95__Unholy-v1-12L-13B | 2023-09-18T13:53:39.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/Unholy-v1-12L-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/Unholy-v1-12L-13B](https://huggingface.co/Undi95/Unholy-v1-12L-13B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Unholy-v1-12L-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T13:52:19.375562](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Unholy-v1-12L-13B/blob/main/results_2023-09-18T13-52-19.375562.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5823767213037238,\n\
\ \"acc_stderr\": 0.03403833440142264,\n \"acc_norm\": 0.5860936635102556,\n\
\ \"acc_norm_stderr\": 0.034016793988093735,\n \"mc1\": 0.36964504283965727,\n\
\ \"mc1_stderr\": 0.01689818070697389,\n \"mc2\": 0.5109377575595978,\n\
\ \"mc2_stderr\": 0.015388241246569968\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6143344709897611,\n \"acc_stderr\": 0.01422425097325718,\n\
\ \"acc_norm\": 0.6356655290102389,\n \"acc_norm_stderr\": 0.014063260279882417\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6395140410276837,\n\
\ \"acc_stderr\": 0.004791601975612764,\n \"acc_norm\": 0.8374825731925911,\n\
\ \"acc_norm_stderr\": 0.003681708282581456\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319878,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319878\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992072,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.035679697722680495,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.035679697722680495\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.031911782267135466,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.031911782267135466\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5564102564102564,\n \"acc_stderr\": 0.0251891498947642,\n \
\ \"acc_norm\": 0.5564102564102564,\n \"acc_norm_stderr\": 0.0251891498947642\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7559633027522936,\n \"acc_stderr\": 0.018415286351416406,\n \"\
acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.018415286351416406\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906942,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906942\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326468,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326468\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483724,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483724\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7701149425287356,\n\
\ \"acc_stderr\": 0.015046301846691805,\n \"acc_norm\": 0.7701149425287356,\n\
\ \"acc_norm_stderr\": 0.015046301846691805\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895803,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895803\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48156424581005586,\n\
\ \"acc_stderr\": 0.01671113049778282,\n \"acc_norm\": 0.48156424581005586,\n\
\ \"acc_norm_stderr\": 0.01671113049778282\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159614,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159614\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.02726429759980401,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.02726429759980401\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.026041766202717163,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.026041766202717163\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44654498044328556,\n\
\ \"acc_stderr\": 0.012697046024399685,\n \"acc_norm\": 0.44654498044328556,\n\
\ \"acc_norm_stderr\": 0.012697046024399685\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.03016191193076711,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.03016191193076711\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5800653594771242,\n \"acc_stderr\": 0.019966811178256477,\n \
\ \"acc_norm\": 0.5800653594771242,\n \"acc_norm_stderr\": 0.019966811178256477\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683913,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683913\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36964504283965727,\n\
\ \"mc1_stderr\": 0.01689818070697389,\n \"mc2\": 0.5109377575595978,\n\
\ \"mc2_stderr\": 0.015388241246569968\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/Unholy-v1-12L-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-52-19.375562.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-52-19.375562.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-52-19.375562.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-52-19.375562.parquet'
- config_name: results
data_files:
- split: 2023_09_18T13_52_19.375562
path:
- results_2023-09-18T13-52-19.375562.parquet
- split: latest
path:
- results_2023-09-18T13-52-19.375562.parquet
---
# Dataset Card for Evaluation run of Undi95/Unholy-v1-12L-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/Unholy-v1-12L-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/Unholy-v1-12L-13B](https://huggingface.co/Undi95/Unholy-v1-12L-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Unholy-v1-12L-13B",
"harness_truthfulqa_mc_0",
	split="latest")
```
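The aggregated scores live in the "results" configuration, and every configuration also exposes a "latest" split alias (both are declared in this card's YAML header). A minimal sketch for pulling the most recent aggregated metrics:
```python
from datasets import load_dataset

# "results" and the "latest" split alias come from the configs declared
# in the YAML header of this card.
results = load_dataset("open-llm-leaderboard/details_Undi95__Unholy-v1-12L-13B",
                       "results",
                       split="latest")
```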
## Latest results
These are the [latest results from run 2023-09-18T13:52:19.375562](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Unholy-v1-12L-13B/blob/main/results_2023-09-18T13-52-19.375562.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5823767213037238,
"acc_stderr": 0.03403833440142264,
"acc_norm": 0.5860936635102556,
"acc_norm_stderr": 0.034016793988093735,
"mc1": 0.36964504283965727,
"mc1_stderr": 0.01689818070697389,
"mc2": 0.5109377575595978,
"mc2_stderr": 0.015388241246569968
},
"harness|arc:challenge|25": {
"acc": 0.6143344709897611,
"acc_stderr": 0.01422425097325718,
"acc_norm": 0.6356655290102389,
"acc_norm_stderr": 0.014063260279882417
},
"harness|hellaswag|10": {
"acc": 0.6395140410276837,
"acc_stderr": 0.004791601975612764,
"acc_norm": 0.8374825731925911,
"acc_norm_stderr": 0.003681708282581456
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319878,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319878
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992072,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.042857142857142816,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.042857142857142816
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.035679697722680495,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.035679697722680495
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.031911782267135466,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.031911782267135466
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5564102564102564,
"acc_stderr": 0.0251891498947642,
"acc_norm": 0.5564102564102564,
"acc_norm_stderr": 0.0251891498947642
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253252,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253252
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.018415286351416406,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.018415286351416406
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906942,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906942
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326468,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326468
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483724,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483724
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7701149425287356,
"acc_stderr": 0.015046301846691805,
"acc_norm": 0.7701149425287356,
"acc_norm_stderr": 0.015046301846691805
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895803,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48156424581005586,
"acc_stderr": 0.01671113049778282,
"acc_norm": 0.48156424581005586,
"acc_norm_stderr": 0.01671113049778282
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159614,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159614
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.02726429759980401,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.02726429759980401
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.026041766202717163,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.026041766202717163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44654498044328556,
"acc_stderr": 0.012697046024399685,
"acc_norm": 0.44654498044328556,
"acc_norm_stderr": 0.012697046024399685
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.03016191193076711,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.03016191193076711
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5800653594771242,
"acc_stderr": 0.019966811178256477,
"acc_norm": 0.5800653594771242,
"acc_norm_stderr": 0.019966811178256477
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683913,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683913
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36964504283965727,
"mc1_stderr": 0.01689818070697389,
"mc2": 0.5109377575595978,
"mc2_stderr": 0.015388241246569968
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/yamada_ryo_bocchitherock | 2023-09-18T13:56:26.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yamada Ryō
This is the dataset of Yamada Ryō, containing 282 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 282 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 631 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 282 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 282 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 282 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 282 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 282 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 631 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 631 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 631 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
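The archives in the table are plain files in this dataset repository, so a variant can also be fetched programmatically. A minimal sketch using `huggingface_hub` (the filename is taken from the 512x512 row above, assuming the zips sit at the repo root as the relative links suggest):
```python
from huggingface_hub import hf_hub_download

# Download one packaged variant; the filename matches the "512x512"
# row of the table above.
zip_path = hf_hub_download(
    repo_id="CyberHarem/yamada_ryo_bocchitherock",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
print(zip_path)  # local path to the cached archive
```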
|
TeraTTS/stress_dataset_sft_proza | 2023-10-03T08:29:04.000Z | [
"license:mit",
"region:us"
] | TeraTTS | null | null | null | 1 | 0 | ---
license: mit
---
|
aminoss/canserbero | 2023-09-18T13:54:53.000Z | [
"license:openrail",
"region:us"
] | aminoss | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_Undi95__MLewdBoros-L2-13B | 2023-09-18T13:58:01.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/MLewdBoros-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/MLewdBoros-L2-13B](https://huggingface.co/Undi95/MLewdBoros-L2-13B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__MLewdBoros-L2-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T13:56:38.282478](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewdBoros-L2-13B/blob/main/results_2023-09-18T13-56-38.282478.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5673804756955018,\n\
\ \"acc_stderr\": 0.03428328895486213,\n \"acc_norm\": 0.5712986926383822,\n\
\ \"acc_norm_stderr\": 0.03426076119801903,\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.01655716732251688,\n \"mc2\": 0.48136107027773045,\n\
\ \"mc2_stderr\": 0.015082983111012829\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5947098976109215,\n \"acc_stderr\": 0.014346869060229327,\n\
\ \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893454\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6385182234614618,\n\
\ \"acc_stderr\": 0.004794478426382608,\n \"acc_norm\": 0.8389762995419239,\n\
\ \"acc_norm_stderr\": 0.003668016360975837\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n\
\ \"acc_stderr\": 0.026795560848122804,\n \"acc_norm\": 0.667741935483871,\n\
\ \"acc_norm_stderr\": 0.026795560848122804\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.0364620496325381,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.0364620496325381\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713545,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713545\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117467,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117467\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03196876989195778,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03196876989195778\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7376146788990826,\n\
\ \"acc_stderr\": 0.018861885021534734,\n \"acc_norm\": 0.7376146788990826,\n\
\ \"acc_norm_stderr\": 0.018861885021534734\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n\
\ \"acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035303,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035303\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302873,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302873\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652258,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652258\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\
\ \"acc_stderr\": 0.015246803197398675,\n \"acc_norm\": 0.7611749680715197,\n\
\ \"acc_norm_stderr\": 0.015246803197398675\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.025862201852277902,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.025862201852277902\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n\
\ \"acc_stderr\": 0.01658868086453062,\n \"acc_norm\": 0.43687150837988825,\n\
\ \"acc_norm_stderr\": 0.01658868086453062\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023344,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023344\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.02682280175950789,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.02682280175950789\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.029525914302558555,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.029525914302558555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n\
\ \"acc_stderr\": 0.012635799922765844,\n \"acc_norm\": 0.4276401564537158,\n\
\ \"acc_norm_stderr\": 0.012635799922765844\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.03018753206032938,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.03018753206032938\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5751633986928104,\n \"acc_stderr\": 0.01999797303545833,\n \
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.01999797303545833\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.03071356045510849,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.03071356045510849\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.01655716732251688,\n \"mc2\": 0.48136107027773045,\n\
\ \"mc2_stderr\": 0.015082983111012829\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/MLewdBoros-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-56-38.282478.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-56-38.282478.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-56-38.282478.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-56-38.282478.parquet'
- config_name: results
data_files:
- split: 2023_09_18T13_56_38.282478
path:
- results_2023-09-18T13-56-38.282478.parquet
- split: latest
path:
- results_2023-09-18T13-56-38.282478.parquet
---
# Dataset Card for Evaluation run of Undi95/MLewdBoros-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/MLewdBoros-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/MLewdBoros-L2-13B](https://huggingface.co/Undi95/MLewdBoros-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__MLewdBoros-L2-13B",
"harness_truthfulqa_mc_0",
split="train")
```
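For a quick look at the aggregated metrics without going through the per-task configurations, the "results" configuration declared in the YAML header can be loaded the same way. A minimal sketch, assuming the "results" config and "latest" split declared above behave like the per-task ones:
```python
from datasets import load_dataset

# Load the aggregated results of the run; "results" and "latest" are the
# config and split names declared in this card's YAML header.
results = load_dataset("open-llm-leaderboard/details_Undi95__MLewdBoros-L2-13B",
                       "results",
                       split="latest")
print(results)
```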
## Latest results
These are the [latest results from run 2023-09-18T13:56:38.282478](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewdBoros-L2-13B/blob/main/results_2023-09-18T13-56-38.282478.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5673804756955018,
"acc_stderr": 0.03428328895486213,
"acc_norm": 0.5712986926383822,
"acc_norm_stderr": 0.03426076119801903,
"mc1": 0.3378212974296206,
"mc1_stderr": 0.01655716732251688,
"mc2": 0.48136107027773045,
"mc2_stderr": 0.015082983111012829
},
"harness|arc:challenge|25": {
"acc": 0.5947098976109215,
"acc_stderr": 0.014346869060229327,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.014144193471893454
},
"harness|hellaswag|10": {
"acc": 0.6385182234614618,
"acc_stderr": 0.004794478426382608,
"acc_norm": 0.8389762995419239,
"acc_norm_stderr": 0.003668016360975837
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.026795560848122804,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.026795560848122804
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.0364620496325381,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.0364620496325381
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713545,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713545
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117467,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.018861885021534734,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.018861885021534734
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035303,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035303
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302873,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302873
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652258,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652258
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398675,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398675
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.025862201852277902,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.025862201852277902
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43687150837988825,
"acc_stderr": 0.01658868086453062,
"acc_norm": 0.43687150837988825,
"acc_norm_stderr": 0.01658868086453062
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.027780141207023344,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.027780141207023344
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.02682280175950789,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.02682280175950789
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.029525914302558555,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.029525914302558555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.012635799922765844,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.012635799922765844
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.03018753206032938,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.03018753206032938
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.01999797303545833,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.01999797303545833
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.03071356045510849,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.03071356045510849
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3378212974296206,
"mc1_stderr": 0.01655716732251688,
"mc2": 0.48136107027773045,
"mc2_stderr": 0.015082983111012829
}
}
```
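As an illustration of working with these numbers, the per-task entries above can be ranked by accuracy once the JSON is loaded. A hedged sketch (the local filename `results.json` is hypothetical; the key pattern `harness|hendrycksTest-` matches the entries shown above):
```python
import json

# Load the results JSON shown above (saved locally under a hypothetical name).
with open("results.json") as f:
    results = json.load(f)

# Keep only the MMLU (hendrycksTest) subtasks and rank them by accuracy.
mmlu = {task: metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc:.3f}")
```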
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
aminoss/can | 2023-09-18T14:00:04.000Z | [
"license:openrail",
"region:us"
] | aminoss | null | null | null | 0 | 0 | ---
license: openrail
---
|
richardogundele/dialogue | 2023-09-18T13:59:02.000Z | [
"region:us"
] | richardogundele | null | null | null | 0 | 0 | |
open-llm-leaderboard/details_Undi95__ReMM-v2-L2-13B | 2023-09-18T14:00:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/ReMM-v2-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/ReMM-v2-L2-13B](https://huggingface.co/Undi95/ReMM-v2-L2-13B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__ReMM-v2-L2-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T13:58:45.934639](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-v2-L2-13B/blob/main/results_2023-09-18T13-58-45.934639.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5632990571605488,\n\
\ \"acc_stderr\": 0.0344139510970497,\n \"acc_norm\": 0.5670865827687058,\n\
\ \"acc_norm_stderr\": 0.03439187060727053,\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5081127343633631,\n\
\ \"mc2_stderr\": 0.015610906083140244\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225402,\n\
\ \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.014188277712349812\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6480780720971918,\n\
\ \"acc_stderr\": 0.0047659375151971875,\n \"acc_norm\": 0.8399721171081458,\n\
\ \"acc_norm_stderr\": 0.0036588262081016167\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776285,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776285\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670788,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670788\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993179,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993179\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n\
\ \"acc_stderr\": 0.027327548447957532,\n \"acc_norm\": 0.6387096774193548,\n\
\ \"acc_norm_stderr\": 0.027327548447957532\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.034819048444388045,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.034819048444388045\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.02951928261681724,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.02951928261681724\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5205128205128206,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.0284934650910286,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.0284934650910286\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03196876989195778,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03196876989195778\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.726605504587156,\n \"acc_stderr\": 0.019109299846098285,\n \"\
acc_norm\": 0.726605504587156,\n \"acc_norm_stderr\": 0.019109299846098285\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.033622774366080445,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.033622774366080445\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302873,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302873\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.045723723587374296,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.045723723587374296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890484,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890484\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n\
\ \"acc_stderr\": 0.01513338327898883,\n \"acc_norm\": 0.7662835249042146,\n\
\ \"acc_norm_stderr\": 0.01513338327898883\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.02595005433765408,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.02595005433765408\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n\
\ \"acc_stderr\": 0.016635838341631914,\n \"acc_norm\": 0.4491620111731844,\n\
\ \"acc_norm_stderr\": 0.016635838341631914\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283693,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.027264297599804012,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.027264297599804012\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.027002521034516475,\n\
\ \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.027002521034516475\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n\
\ \"acc_stderr\": 0.012633353557534423,\n \"acc_norm\": 0.42698826597131684,\n\
\ \"acc_norm_stderr\": 0.012633353557534423\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5833333333333334,\n \"acc_stderr\": 0.019944914136873583,\n \
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.019944914136873583\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.03106721126287247,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.03106721126287247\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.032200241045342054,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.032200241045342054\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826369,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826369\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5081127343633631,\n\
\ \"mc2_stderr\": 0.015610906083140244\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/ReMM-v2-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-58-45.934639.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-58-45.934639.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-58-45.934639.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-58-45.934639.parquet'
- config_name: results
data_files:
- split: 2023_09_18T13_58_45.934639
path:
- results_2023-09-18T13-58-45.934639.parquet
- split: latest
path:
- results_2023-09-18T13-58-45.934639.parquet
---
# Dataset Card for Evaluation run of Undi95/ReMM-v2-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/ReMM-v2-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/ReMM-v2-L2-13B](https://huggingface.co/Undi95/ReMM-v2-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-v2-L2-13B",
"harness_truthfulqa_mc_0",
split="train")
```
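The aggregated metrics are exposed through the "results" configuration listed in this card's metadata; a minimal sketch for loading them (the config name "results" and the split name "latest" are taken from the `configs` section above):
```python
from datasets import load_dataset

# "results" and "latest" match the configs section of this card's metadata.
results = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-v2-L2-13B",
	"results",
	split="latest")
```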
## Latest results
These are the [latest results from run 2023-09-18T13:58:45.934639](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-v2-L2-13B/blob/main/results_2023-09-18T13-58-45.934639.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5632990571605488,
"acc_stderr": 0.0344139510970497,
"acc_norm": 0.5670865827687058,
"acc_norm_stderr": 0.03439187060727053,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.5081127343633631,
"mc2_stderr": 0.015610906083140244
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225402,
"acc_norm": 0.6194539249146758,
"acc_norm_stderr": 0.014188277712349812
},
"harness|hellaswag|10": {
"acc": 0.6480780720971918,
"acc_stderr": 0.0047659375151971875,
"acc_norm": 0.8399721171081458,
"acc_norm_stderr": 0.0036588262081016167
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.030151134457776285,
"acc_norm": 0.6,
"acc_norm_stderr": 0.030151134457776285
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670788,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670788
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.03807301726504511,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.03807301726504511
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993179,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993179
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596433,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.027327548447957532,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.027327548447957532
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819135,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819135
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.032424979581788166,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.032424979581788166
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.02951928261681724,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.02951928261681724
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.0284934650910286,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.0284934650910286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.726605504587156,
"acc_stderr": 0.019109299846098285,
"acc_norm": 0.726605504587156,
"acc_norm_stderr": 0.019109299846098285
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.033622774366080445,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.033622774366080445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302873,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302873
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.045723723587374296,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.045723723587374296
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890484,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890484
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.01513338327898883,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.01513338327898883
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.02595005433765408,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.02595005433765408
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4491620111731844,
"acc_stderr": 0.016635838341631914,
"acc_norm": 0.4491620111731844,
"acc_norm_stderr": 0.016635838341631914
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.027826109307283693,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.027826109307283693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.027264297599804012,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.027264297599804012
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.027002521034516475,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.027002521034516475
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534423,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534423
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.019944914136873583,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.019944914136873583
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.03106721126287247,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.03106721126287247
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.032200241045342054,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.032200241045342054
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826369,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826369
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.5081127343633631,
"mc2_stderr": 0.015610906083140244
}
}
```
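As a quick sanity check, the per-task entries above can be aggregated by hand. A minimal sketch, assuming the JSON block above has been saved locally as `results.json` (a hypothetical path) and that the MMLU score is the unweighted mean of the per-subtask accuracies:
```python
import json

# Load the results dictionary shown above; the file path is an assumption.
with open("results.json") as f:
    results = json.load(f)

# Average "acc" over the MMLU (hendrycksTest) subtasks.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
print(f"MMLU average acc: {sum(mmlu) / len(mmlu):.4f}")
```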
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-L2-13B | 2023-09-18T14:02:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Doctor-Shotgun/CalliopeDS-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Doctor-Shotgun/CalliopeDS-L2-13B](https://huggingface.co/Doctor-Shotgun/CalliopeDS-L2-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-L2-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T14:00:51.912601](https://huggingface.co/datasets/open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-L2-13B/blob/main/results_2023-09-18T14-00-51.912601.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5593935221127917,\n\
\ \"acc_stderr\": 0.03444564939235036,\n \"acc_norm\": 0.5634274329296816,\n\
\ \"acc_norm_stderr\": 0.0344238498265519,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965838,\n \"mc2\": 0.5132465172345371,\n\
\ \"mc2_stderr\": 0.015659242228106055\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5674061433447098,\n \"acc_stderr\": 0.014478005694182524,\n\
\ \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.01428589829293817\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6333399721171081,\n\
\ \"acc_stderr\": 0.004809077205343493,\n \"acc_norm\": 0.8337980481975702,\n\
\ \"acc_norm_stderr\": 0.0037150102244786136\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480863,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480863\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923183,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923183\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.024180497164376907,\n \"\
acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376907\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6193548387096774,\n \"acc_stderr\": 0.027621717832907036,\n \"\
acc_norm\": 0.6193548387096774,\n \"acc_norm_stderr\": 0.027621717832907036\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"\
acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817244,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48717948717948717,\n \"acc_stderr\": 0.025342671293807257,\n\
\ \"acc_norm\": 0.48717948717948717,\n \"acc_norm_stderr\": 0.025342671293807257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7394495412844037,\n\
\ \"acc_stderr\": 0.01881918203485007,\n \"acc_norm\": 0.7394495412844037,\n\
\ \"acc_norm_stderr\": 0.01881918203485007\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n\
\ \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\
\ \"acc_stderr\": 0.02490443909891824,\n \"acc_norm\": 0.8247863247863247,\n\
\ \"acc_norm_stderr\": 0.02490443909891824\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n\
\ \"acc_stderr\": 0.015133383278988827,\n \"acc_norm\": 0.7662835249042146,\n\
\ \"acc_norm_stderr\": 0.015133383278988827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.02632981334194625,\n\
\ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.02632981334194625\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4782122905027933,\n\
\ \"acc_stderr\": 0.01670661752217613,\n \"acc_norm\": 0.4782122905027933,\n\
\ \"acc_norm_stderr\": 0.01670661752217613\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.028180596328259287,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.028180596328259287\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\
\ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612496,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612496\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\
\ \"acc_stderr\": 0.012604960816087375,\n \"acc_norm\": 0.4198174706649283,\n\
\ \"acc_norm_stderr\": 0.012604960816087375\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5620915032679739,\n \"acc_stderr\": 0.020071257886886525,\n \
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.020071257886886525\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872478,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.031067211262872478\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533204,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533204\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965838,\n \"mc2\": 0.5132465172345371,\n\
\ \"mc2_stderr\": 0.015659242228106055\n }\n}\n```"
repo_url: https://huggingface.co/Doctor-Shotgun/CalliopeDS-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-00-51.912601.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- results_2023-09-18T14-00-51.912601.parquet
- split: latest
path:
- results_2023-09-18T14-00-51.912601.parquet
---
# Dataset Card for Evaluation run of Doctor-Shotgun/CalliopeDS-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Doctor-Shotgun/CalliopeDS-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Doctor-Shotgun/CalliopeDS-L2-13B](https://huggingface.co/Doctor-Shotgun/CalliopeDS-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-L2-13B",
"harness_truthfulqa_mc_0",
split="train")
```
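The aggregated metrics live in the dedicated "results" configuration listed in the `configs` section above. As a minimal sketch (assuming the config and split names shown in this card), they can be loaded the same way:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# the "latest" split resolves to the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-L2-13B",
    "results",
    split="latest",
)
print(results[0])  # one row with the aggregated metrics for this run
```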
## Latest results
These are the [latest results from run 2023-09-18T14:00:51.912601](https://huggingface.co/datasets/open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-L2-13B/blob/main/results_2023-09-18T14-00-51.912601.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5593935221127917,
"acc_stderr": 0.03444564939235036,
"acc_norm": 0.5634274329296816,
"acc_norm_stderr": 0.0344238498265519,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965838,
"mc2": 0.5132465172345371,
"mc2_stderr": 0.015659242228106055
},
"harness|arc:challenge|25": {
"acc": 0.5674061433447098,
"acc_stderr": 0.014478005694182524,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.01428589829293817
},
"harness|hellaswag|10": {
"acc": 0.6333399721171081,
"acc_stderr": 0.004809077205343493,
"acc_norm": 0.8337980481975702,
"acc_norm_stderr": 0.0037150102244786136
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480863,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480863
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923183,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923183
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376907,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6193548387096774,
"acc_stderr": 0.027621717832907036,
"acc_norm": 0.6193548387096774,
"acc_norm_stderr": 0.027621717832907036
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817244,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48717948717948717,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.48717948717948717,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7394495412844037,
"acc_stderr": 0.01881918203485007,
"acc_norm": 0.7394495412844037,
"acc_norm_stderr": 0.01881918203485007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.02490443909891824,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.02490443909891824
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.015133383278988827,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.015133383278988827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6040462427745664,
"acc_stderr": 0.02632981334194625,
"acc_norm": 0.6040462427745664,
"acc_norm_stderr": 0.02632981334194625
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4782122905027933,
"acc_stderr": 0.01670661752217613,
"acc_norm": 0.4782122905027933,
"acc_norm_stderr": 0.01670661752217613
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.028180596328259287,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.028180596328259287
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6141975308641975,
"acc_stderr": 0.027085401226132143,
"acc_norm": 0.6141975308641975,
"acc_norm_stderr": 0.027085401226132143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612496,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612496
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.012604960816087375,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.012604960816087375
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.49264705882352944,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.49264705882352944,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.020071257886886525,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.020071257886886525
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.031067211262872478,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.031067211262872478
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533204,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533204
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965838,
"mc2": 0.5132465172345371,
"mc2_stderr": 0.015659242228106055
}
}
```
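The block above is a plain mapping from task name to its metrics (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`, plus `mc1`/`mc2` for TruthfulQA). As an illustration only (not part of the evaluation output), here is a short sketch that averages `acc_norm` over the `hendrycksTest` (MMLU) tasks; the dictionary is truncated here for brevity:
```python
# Hypothetical snippet: assumes `results` holds the dictionary printed above,
# e.g. parsed with json.load() from the linked results file.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.36},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.45185185185185184},
    # ... the remaining tasks exactly as in the block above ...
}

# Filter on the task-name prefix and average the normalized accuracy.
mmlu = [
    m["acc_norm"]
    for task, m in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"mean acc_norm over {len(mmlu)} MMLU tasks: {sum(mmlu) / len(mmlu):.4f}")
```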
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DaisyStar004/iCliniq_data | 2023-09-18T14:14:49.000Z | [
"region:us"
] | DaisyStar004 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7579290
num_examples: 7321
download_size: 4355411
dataset_size: 7579290
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "iCliniq_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
barto17/gtzan_all_preprocessed_kaggle_version | 2023-09-18T14:56:06.000Z | [
"region:us"
] | barto17 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': blues
'1': classical
'2': country
'3': disco
'4': hiphop
'5': jazz
'6': metal
'7': pop
'8': reggae
'9': rock
- name: input_values
sequence: float32
- name: attention_mask
sequence: int32
splits:
- name: train
num_bytes: 3452159816
num_examples: 899
- name: test
num_bytes: 384000696
num_examples: 100
download_size: 1923103931
dataset_size: 3836160512
---
# Dataset Card for "gtzan_all_preprocessed_kaggle_version"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Chesf/J | 2023-09-20T06:36:25.000Z | [
"region:us"
] | Chesf | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_teknium__OpenHermes-7B | 2023-09-18T14:10:23.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of teknium/OpenHermes-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [teknium/OpenHermes-7B](https://huggingface.co/teknium/OpenHermes-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_teknium__OpenHermes-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T14:09:00.502210](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-7B/blob/main/results_2023-09-18T14-09-00.502210.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4886592917372928,\n\
\ \"acc_stderr\": 0.03506569549642699,\n \"acc_norm\": 0.49248755559863605,\n\
\ \"acc_norm_stderr\": 0.035050737718166664,\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589657,\n \"mc2\": 0.44995354312872166,\n\
\ \"mc2_stderr\": 0.014767124906788017\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5281569965870307,\n \"acc_stderr\": 0.014588204105102203,\n\
\ \"acc_norm\": 0.5614334470989761,\n \"acc_norm_stderr\": 0.014500682618212865\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.59061939852619,\n \
\ \"acc_stderr\": 0.00490714622934755,\n \"acc_norm\": 0.7832105158334993,\n\
\ \"acc_norm_stderr\": 0.004112158798877644\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5169811320754717,\n \"acc_stderr\": 0.030755120364119905,\n\
\ \"acc_norm\": 0.5169811320754717,\n \"acc_norm_stderr\": 0.030755120364119905\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4791666666666667,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.4791666666666667,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.4393063583815029,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.023577604791655823,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.023577604791655823\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5419354838709678,\n\
\ \"acc_stderr\": 0.028343787250540618,\n \"acc_norm\": 0.5419354838709678,\n\
\ \"acc_norm_stderr\": 0.028343787250540618\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187897,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187897\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6111111111111112,\n \"acc_stderr\": 0.0347327959083696,\n \"acc_norm\"\
: 0.6111111111111112,\n \"acc_norm_stderr\": 0.0347327959083696\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7098445595854922,\n \"acc_stderr\": 0.03275264467791516,\n\
\ \"acc_norm\": 0.7098445595854922,\n \"acc_norm_stderr\": 0.03275264467791516\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44358974358974357,\n \"acc_stderr\": 0.025189149894764198,\n\
\ \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.025189149894764198\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6678899082568808,\n \"acc_stderr\": 0.020192682985423337,\n \"\
acc_norm\": 0.6678899082568808,\n \"acc_norm_stderr\": 0.020192682985423337\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3194444444444444,\n \"acc_stderr\": 0.0317987634217685,\n \"acc_norm\"\
: 0.3194444444444444,\n \"acc_norm_stderr\": 0.0317987634217685\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6470588235294118,\n\
\ \"acc_stderr\": 0.03354092437591519,\n \"acc_norm\": 0.6470588235294118,\n\
\ \"acc_norm_stderr\": 0.03354092437591519\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.6835443037974683,\n \"acc_stderr\": 0.030274974880218977,\n\
\ \"acc_norm\": 0.6835443037974683,\n \"acc_norm_stderr\": 0.030274974880218977\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5919282511210763,\n\
\ \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.5919282511210763,\n\
\ \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.48466257668711654,\n \"acc_stderr\": 0.039265223787088424,\n\
\ \"acc_norm\": 0.48466257668711654,\n \"acc_norm_stderr\": 0.039265223787088424\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.04721188506097172,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.04721188506097172\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7264957264957265,\n\
\ \"acc_stderr\": 0.02920254015343118,\n \"acc_norm\": 0.7264957264957265,\n\
\ \"acc_norm_stderr\": 0.02920254015343118\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6679438058748404,\n\
\ \"acc_stderr\": 0.016841174655295724,\n \"acc_norm\": 0.6679438058748404,\n\
\ \"acc_norm_stderr\": 0.016841174655295724\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5317919075144508,\n \"acc_stderr\": 0.026864624366756646,\n\
\ \"acc_norm\": 0.5317919075144508,\n \"acc_norm_stderr\": 0.026864624366756646\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n\
\ \"acc_stderr\": 0.014756906483260657,\n \"acc_norm\": 0.264804469273743,\n\
\ \"acc_norm_stderr\": 0.014756906483260657\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.028607893699576066,\n\
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.028607893699576066\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n\
\ \"acc_stderr\": 0.028013651891995076,\n \"acc_norm\": 0.5819935691318328,\n\
\ \"acc_norm_stderr\": 0.028013651891995076\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5432098765432098,\n \"acc_stderr\": 0.027716661650194038,\n\
\ \"acc_norm\": 0.5432098765432098,\n \"acc_norm_stderr\": 0.027716661650194038\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.028999080904806178,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.028999080904806178\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3683181225554107,\n\
\ \"acc_stderr\": 0.012319403369564639,\n \"acc_norm\": 0.3683181225554107,\n\
\ \"acc_norm_stderr\": 0.012319403369564639\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.45588235294117646,\n \"acc_stderr\": 0.020148939420415738,\n \
\ \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.020148939420415738\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5510204081632653,\n \"acc_stderr\": 0.03184213866687579,\n\
\ \"acc_norm\": 0.5510204081632653,\n \"acc_norm_stderr\": 0.03184213866687579\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n\
\ \"acc_stderr\": 0.03390393042268813,\n \"acc_norm\": 0.6417910447761194,\n\
\ \"acc_norm_stderr\": 0.03390393042268813\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589657,\n \"mc2\": 0.44995354312872166,\n\
\ \"mc2_stderr\": 0.014767124906788017\n }\n}\n```"
repo_url: https://huggingface.co/teknium/OpenHermes-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-09-00.502210.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- results_2023-09-18T14-09-00.502210.parquet
- split: latest
path:
- results_2023-09-18T14-09-00.502210.parquet
---
# Dataset Card for Evaluation run of teknium/OpenHermes-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/teknium/OpenHermes-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [teknium/OpenHermes-7B](https://huggingface.co/teknium/OpenHermes-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_teknium__OpenHermes-7B",
"harness_truthfulqa_mc_0",
split="train")
```
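The "latest" alias shown in the configs above can be used the same way; a minimal sketch, assuming the configuration and split names listed in the YAML header:
```python
from datasets import load_dataset

# Load the same per-example details via the "latest" split, which
# always resolves to the most recent timestamped run.
data = load_dataset("open-llm-leaderboard/details_teknium__OpenHermes-7B",
                    "harness_truthfulqa_mc_0",
                    split="latest")
```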
## Latest results
These are the [latest results from run 2023-09-18T14:09:00.502210](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-7B/blob/main/results_2023-09-18T14-09-00.502210.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4886592917372928,
"acc_stderr": 0.03506569549642699,
"acc_norm": 0.49248755559863605,
"acc_norm_stderr": 0.035050737718166664,
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589657,
"mc2": 0.44995354312872166,
"mc2_stderr": 0.014767124906788017
},
"harness|arc:challenge|25": {
"acc": 0.5281569965870307,
"acc_stderr": 0.014588204105102203,
"acc_norm": 0.5614334470989761,
"acc_norm_stderr": 0.014500682618212865
},
"harness|hellaswag|10": {
"acc": 0.59061939852619,
"acc_stderr": 0.00490714622934755,
"acc_norm": 0.7832105158334993,
"acc_norm_stderr": 0.004112158798877644
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5169811320754717,
"acc_stderr": 0.030755120364119905,
"acc_norm": 0.5169811320754717,
"acc_norm_stderr": 0.030755120364119905
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4791666666666667,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.4791666666666667,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.023577604791655823,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.023577604791655823
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5419354838709678,
"acc_stderr": 0.028343787250540618,
"acc_norm": 0.5419354838709678,
"acc_norm_stderr": 0.028343787250540618
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187897,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187897
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0347327959083696,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0347327959083696
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7098445595854922,
"acc_stderr": 0.03275264467791516,
"acc_norm": 0.7098445595854922,
"acc_norm_stderr": 0.03275264467791516
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44358974358974357,
"acc_stderr": 0.025189149894764198,
"acc_norm": 0.44358974358974357,
"acc_norm_stderr": 0.025189149894764198
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6678899082568808,
"acc_stderr": 0.020192682985423337,
"acc_norm": 0.6678899082568808,
"acc_norm_stderr": 0.020192682985423337
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.0317987634217685,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.0317987634217685
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.03354092437591519,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.03354092437591519
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6835443037974683,
"acc_stderr": 0.030274974880218977,
"acc_norm": 0.6835443037974683,
"acc_norm_stderr": 0.030274974880218977
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5919282511210763,
"acc_stderr": 0.03298574607842822,
"acc_norm": 0.5919282511210763,
"acc_norm_stderr": 0.03298574607842822
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.48466257668711654,
"acc_stderr": 0.039265223787088424,
"acc_norm": 0.48466257668711654,
"acc_norm_stderr": 0.039265223787088424
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.04721188506097172,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.04721188506097172
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7264957264957265,
"acc_stderr": 0.02920254015343118,
"acc_norm": 0.7264957264957265,
"acc_norm_stderr": 0.02920254015343118
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6679438058748404,
"acc_stderr": 0.016841174655295724,
"acc_norm": 0.6679438058748404,
"acc_norm_stderr": 0.016841174655295724
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.026864624366756646,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.026864624366756646
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260657,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260657
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.028607893699576066,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.028607893699576066
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5819935691318328,
"acc_stderr": 0.028013651891995076,
"acc_norm": 0.5819935691318328,
"acc_norm_stderr": 0.028013651891995076
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5432098765432098,
"acc_stderr": 0.027716661650194038,
"acc_norm": 0.5432098765432098,
"acc_norm_stderr": 0.027716661650194038
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.028999080904806178,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.028999080904806178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3683181225554107,
"acc_stderr": 0.012319403369564639,
"acc_norm": 0.3683181225554107,
"acc_norm_stderr": 0.012319403369564639
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.020148939420415738,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.020148939420415738
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5510204081632653,
"acc_stderr": 0.03184213866687579,
"acc_norm": 0.5510204081632653,
"acc_norm_stderr": 0.03184213866687579
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268813,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268813
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589657,
"mc2": 0.44995354312872166,
"mc2_stderr": 0.014767124906788017
}
}
```
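The aggregated numbers above are also stored as a dataset; a minimal sketch, assuming the "results" configuration and "latest" split listed in the YAML header:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics printed above;
# its "latest" split points at the newest results parquet file.
results = load_dataset("open-llm-leaderboard/details_teknium__OpenHermes-7B",
                       "results",
                       split="latest")
```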
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
seohyun-kang-ringle/pronunciation | 2023-09-18T14:11:34.000Z | [
"region:us"
] | seohyun-kang-ringle | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_lgaalves__llama-2-13b-hf-platypus | 2023-09-18T14:17:09.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lgaalves/llama-2-13b-hf-platypus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lgaalves/llama-2-13b-hf-platypus](https://huggingface.co/lgaalves/llama-2-13b-hf-platypus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__llama-2-13b-hf-platypus\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T14:15:46.670153](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__llama-2-13b-hf-platypus/blob/main/results_2023-09-18T14-15-46.670153.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5509125885849774,\n\
\ \"acc_stderr\": 0.0344588285887975,\n \"acc_norm\": 0.555047768984873,\n\
\ \"acc_norm_stderr\": 0.03443868276596075,\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.015846315101394802,\n \"mc2\": 0.4284193316007184,\n\
\ \"mc2_stderr\": 0.014486178746194435\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5511945392491467,\n \"acc_stderr\": 0.014534599585097664,\n\
\ \"acc_norm\": 0.5887372013651877,\n \"acc_norm_stderr\": 0.014379441068522082\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6149173471420036,\n\
\ \"acc_stderr\": 0.004856203374715453,\n \"acc_norm\": 0.8213503286197968,\n\
\ \"acc_norm_stderr\": 0.003822758343922915\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009794,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009794\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\
\ \"acc_stderr\": 0.026923446059302844,\n \"acc_norm\": 0.6612903225806451,\n\
\ \"acc_norm_stderr\": 0.026923446059302844\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.035025446508458714,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.035025446508458714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7577981651376147,\n \"acc_stderr\": 0.01836817630659862,\n \"\
acc_norm\": 0.7577981651376147,\n \"acc_norm_stderr\": 0.01836817630659862\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.033509916046960415,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.033509916046960415\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695053,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695053\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.0368035037128646,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.0368035037128646\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.02645350805404033,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.02645350805404033\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395958,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395958\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36983240223463687,\n\
\ \"acc_stderr\": 0.016145881256056215,\n \"acc_norm\": 0.36983240223463687,\n\
\ \"acc_norm_stderr\": 0.016145881256056215\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.028036092273891776,\n\
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.028036092273891776\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.027368078243971635,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.027368078243971635\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.026869490744815257,\n\
\ \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.026869490744815257\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573086,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573086\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4041720990873533,\n\
\ \"acc_stderr\": 0.012533504046491362,\n \"acc_norm\": 0.4041720990873533,\n\
\ \"acc_norm_stderr\": 0.012533504046491362\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5588235294117647,\n \"acc_stderr\": 0.020087362076702857,\n \
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.020087362076702857\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.03168091161233882,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.03168091161233882\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355568,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355568\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.015846315101394802,\n \"mc2\": 0.4284193316007184,\n\
\ \"mc2_stderr\": 0.014486178746194435\n }\n}\n```"
repo_url: https://huggingface.co/lgaalves/llama-2-13b-hf-platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-15-46.670153.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-15-46.670153.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-15-46.670153.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-15-46.670153.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_15_46.670153
path:
- results_2023-09-18T14-15-46.670153.parquet
- split: latest
path:
- results_2023-09-18T14-15-46.670153.parquet
---
# Dataset Card for Evaluation run of lgaalves/llama-2-13b-hf-platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/llama-2-13b-hf-platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/llama-2-13b-hf-platypus](https://huggingface.co/lgaalves/llama-2-13b-hf-platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__llama-2-13b-hf-platypus",
"harness_truthfulqa_mc_0",
split="train")
```
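The same call pattern works for any configuration listed above. As a minimal sketch, the snippet below pulls the aggregated metrics from the "results" configuration and the per-example details of one MMLU subtask, using the "latest" split (all configuration and split names are taken from this card's `configs` section):
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_lgaalves__llama-2-13b-hf-platypus"

# Aggregated metrics of the most recent evaluation run.
results = load_dataset(REPO, "results", split="latest")

# Per-example details for a single MMLU subtask, latest run only.
astronomy = load_dataset(REPO, "harness_hendrycksTest_astronomy_5", split="latest")
```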
## Latest results
These are the [latest results from run 2023-09-18T14:15:46.670153](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__llama-2-13b-hf-platypus/blob/main/results_2023-09-18T14-15-46.670153.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5509125885849774,
"acc_stderr": 0.0344588285887975,
"acc_norm": 0.555047768984873,
"acc_norm_stderr": 0.03443868276596075,
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394802,
"mc2": 0.4284193316007184,
"mc2_stderr": 0.014486178746194435
},
"harness|arc:challenge|25": {
"acc": 0.5511945392491467,
"acc_stderr": 0.014534599585097664,
"acc_norm": 0.5887372013651877,
"acc_norm_stderr": 0.014379441068522082
},
"harness|hellaswag|10": {
"acc": 0.6149173471420036,
"acc_stderr": 0.004856203374715453,
"acc_norm": 0.8213503286197968,
"acc_norm_stderr": 0.003822758343922915
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009794,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009794
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596433,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147126,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302844,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302844
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.035025446508458714,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.035025446508458714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7577981651376147,
"acc_stderr": 0.01836817630659862,
"acc_norm": 0.7577981651376147,
"acc_norm_stderr": 0.01836817630659862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.033509916046960415,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.033509916046960415
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695053,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695053
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.0368035037128646,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.0368035037128646
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.02645350805404033,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.02645350805404033
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395958,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395958
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36983240223463687,
"acc_stderr": 0.016145881256056215,
"acc_norm": 0.36983240223463687,
"acc_norm_stderr": 0.016145881256056215
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.028036092273891776,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.028036092273891776
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971635,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971635
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.026869490744815257,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.026869490744815257
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573086,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573086
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4041720990873533,
"acc_stderr": 0.012533504046491362,
"acc_norm": 0.4041720990873533,
"acc_norm_stderr": 0.012533504046491362
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.020087362076702857,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.020087362076702857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.03168091161233882,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.03168091161233882
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355568,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355568
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394802,
"mc2": 0.4284193316007184,
"mc2_stderr": 0.014486178746194435
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/kita_ikuyo_bocchitherock | 2023-09-18T14:22:56.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kita Ikuyo
This is the dataset of Kita Ikuyo, containing 296 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 296 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 650 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 296 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 296 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 296 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 296 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 296 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 650 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 650 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 650 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
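Each variant in the table is packaged as a zip archive stored in this dataset repository. Below is a minimal sketch of fetching and unpacking one of them with `huggingface_hub` (the repository id and filename are taken from this card; the archives are assumed to sit at the repository root, as the relative links above suggest):
```python
from zipfile import ZipFile

from huggingface_hub import hf_hub_download

# Fetch one packaged variant; the filename matches the "Download" column above.
zip_path = hf_hub_download(
    repo_id="CyberHarem/kita_ikuyo_bocchitherock",
    filename="dataset-raw.zip",
    repo_type="dataset",
)

# Unpack the images and their tag files into a local directory.
with ZipFile(zip_path) as archive:
    archive.extractall("kita_ikuyo_raw")
```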
|
DaisyStar004/iCliniq-llama2-7k | 2023-09-18T14:18:37.000Z | [
"region:us"
] | DaisyStar004 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7229044
num_examples: 7000
download_size: 4177341
dataset_size: 7229044
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "iCliniq-llama2-7k"
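According to the `dataset_info` block above, the dataset ships a single `train` split of 7,000 examples with one string column, `text`. A minimal loading sketch (the repository id and column name are taken from this card):
```python
from datasets import load_dataset

# One default configuration with a single "text" column of 7,000 rows.
ds = load_dataset("DaisyStar004/iCliniq-llama2-7k", split="train")
print(ds[0]["text"][:200])  # peek at the first example
```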
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-40K | 2023-09-18T14:24:15.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of marcchew/Marcoroni-7B-LaMini-40K
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [marcchew/Marcoroni-7B-LaMini-40K](https://huggingface.co/marcchew/Marcoroni-7B-LaMini-40K)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-40K\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T14:22:48.761056](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-40K/blob/main/results_2023-09-18T14-22-48.761056.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and in the \"latest\" split of\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26813486921423935,\n\
\ \"acc_stderr\": 0.03170123248402419,\n \"acc_norm\": 0.2691927993674212,\n\
\ \"acc_norm_stderr\": 0.031717489770405456,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.0148430615077316,\n \"mc2\": 0.47395992315564384,\n\
\ \"mc2_stderr\": 0.01662842623454551\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2226962457337884,\n \"acc_stderr\": 0.01215831477482993,\n\
\ \"acc_norm\": 0.2764505119453925,\n \"acc_norm_stderr\": 0.013069662474252428\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25363473411670984,\n\
\ \"acc_stderr\": 0.004342017709967968,\n \"acc_norm\": 0.2622983469428401,\n\
\ \"acc_norm_stderr\": 0.004389849907040308\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.3352601156069364,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.029241883869628827,\n\
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.029241883869628827\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776564,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776564\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509477,\n \
\ \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.03095663632856655,\n \
\ \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.03095663632856655\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"\
acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n\
\ \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n\
\ \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.20253164556962025,\n \"acc_stderr\": 0.026160568246601457,\n\
\ \"acc_norm\": 0.20253164556962025,\n \"acc_norm_stderr\": 0.026160568246601457\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n\
\ \"acc_stderr\": 0.020799400082879997,\n \"acc_norm\": 0.10762331838565023,\n\
\ \"acc_norm_stderr\": 0.020799400082879997\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"\
acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n\
\ \"acc_stderr\": 0.03485946096475741,\n \"acc_norm\": 0.16071428571428573,\n\
\ \"acc_norm_stderr\": 0.03485946096475741\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225864,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225864\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.22094508301404853,\n\
\ \"acc_stderr\": 0.014836205167333581,\n \"acc_norm\": 0.22094508301404853,\n\
\ \"acc_norm_stderr\": 0.014836205167333581\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02609016250427905,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02609016250427905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2347266881028939,\n\
\ \"acc_stderr\": 0.024071805887677048,\n \"acc_norm\": 0.2347266881028939,\n\
\ \"acc_norm_stderr\": 0.024071805887677048\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537762,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537762\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24445893089960888,\n\
\ \"acc_stderr\": 0.010976425013113886,\n \"acc_norm\": 0.24445893089960888,\n\
\ \"acc_norm_stderr\": 0.010976425013113886\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2173202614379085,\n \"acc_stderr\": 0.01668482092914859,\n \
\ \"acc_norm\": 0.2173202614379085,\n \"acc_norm_stderr\": 0.01668482092914859\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.26865671641791045,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.26865671641791045,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.16265060240963855,\n\
\ \"acc_stderr\": 0.028730237892613798,\n \"acc_norm\": 0.16265060240963855,\n\
\ \"acc_norm_stderr\": 0.028730237892613798\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.0340105262010409,\n\
\ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.0340105262010409\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.0148430615077316,\n \"mc2\": 0.47395992315564384,\n\
\ \"mc2_stderr\": 0.01662842623454551\n }\n}\n```"
repo_url: https://huggingface.co/marcchew/Marcoroni-7B-LaMini-40K
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-22-48.761056.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-22-48.761056.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-22-48.761056.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-22-48.761056.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_22_48.761056
path:
- results_2023-09-18T14-22-48.761056.parquet
- split: latest
path:
- results_2023-09-18T14-22-48.761056.parquet
---
# Dataset Card for Evaluation run of marcchew/Marcoroni-7B-LaMini-40K
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/marcchew/Marcoroni-7B-LaMini-40K
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [marcchew/Marcoroni-7B-LaMini-40K](https://huggingface.co/marcchew/Marcoroni-7B-LaMini-40K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-40K",
"harness_truthfulqa_mc_0",
split="train")
```
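For example, to work with the aggregated metrics instead of the per-example details, you can load the "results" configuration; the configuration list above exposes a timestamped split and a "latest" split for each config. A minimal sketch, assuming those default names:

```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split tracks the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-40K",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics for this run
```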
## Latest results
These are the [latest results from run 2023-09-18T14:22:48.761056](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-40K/blob/main/results_2023-09-18T14-22-48.761056.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26813486921423935,
"acc_stderr": 0.03170123248402419,
"acc_norm": 0.2691927993674212,
"acc_norm_stderr": 0.031717489770405456,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.0148430615077316,
"mc2": 0.47395992315564384,
"mc2_stderr": 0.01662842623454551
},
"harness|arc:challenge|25": {
"acc": 0.2226962457337884,
"acc_stderr": 0.01215831477482993,
"acc_norm": 0.2764505119453925,
"acc_norm_stderr": 0.013069662474252428
},
"harness|hellaswag|10": {
"acc": 0.25363473411670984,
"acc_stderr": 0.004342017709967968,
"acc_norm": 0.2622983469428401,
"acc_norm_stderr": 0.004389849907040308
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.029241883869628827,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.029241883869628827
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776564,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3641025641025641,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.3641025641025641,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20253164556962025,
"acc_stderr": 0.026160568246601457,
"acc_norm": 0.20253164556962025,
"acc_norm_stderr": 0.026160568246601457
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.10762331838565023,
"acc_stderr": 0.020799400082879997,
"acc_norm": 0.10762331838565023,
"acc_norm_stderr": 0.020799400082879997
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.14049586776859505,
"acc_stderr": 0.03172233426002161,
"acc_norm": 0.14049586776859505,
"acc_norm_stderr": 0.03172233426002161
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16071428571428573,
"acc_stderr": 0.03485946096475741,
"acc_norm": 0.16071428571428573,
"acc_norm_stderr": 0.03485946096475741
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225864,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225864
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.22094508301404853,
"acc_stderr": 0.014836205167333581,
"acc_norm": 0.22094508301404853,
"acc_norm_stderr": 0.014836205167333581
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2347266881028939,
"acc_stderr": 0.024071805887677048,
"acc_norm": 0.2347266881028939,
"acc_norm_stderr": 0.024071805887677048
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.025518731049537762,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.025518731049537762
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24445893089960888,
"acc_stderr": 0.010976425013113886,
"acc_norm": 0.24445893089960888,
"acc_norm_stderr": 0.010976425013113886
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2173202614379085,
"acc_stderr": 0.01668482092914859,
"acc_norm": 0.2173202614379085,
"acc_norm_stderr": 0.01668482092914859
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.26865671641791045,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.26865671641791045,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.16265060240963855,
"acc_stderr": 0.028730237892613798,
"acc_norm": 0.16265060240963855,
"acc_norm_stderr": 0.028730237892613798
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.0148430615077316,
"mc2": 0.47395992315564384,
"mc2_stderr": 0.01662842623454551
}
}
```
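Since the snippet above is a plain Python dictionary, it is easy to post-process. A small sketch that ranks the MMLU ("hendrycksTest") sub-tasks by accuracy; only a few entries from the dictionary above are reproduced here so the example runs standalone:

```python
# A few per-task entries copied from the results above.
results = {
    "harness|hendrycksTest-high_school_statistics|5": {"acc": 0.4722222222222222},
    "harness|hendrycksTest-professional_medicine|5": {"acc": 0.4485294117647059},
    "harness|hendrycksTest-college_chemistry|5": {"acc": 0.41},
    "harness|hendrycksTest-human_aging|5": {"acc": 0.10762331838565023},
    "harness|truthfulqa:mc|0": {"mc1": 0.2350061199510404},  # not an MMLU task
}

# Keep only the MMLU sub-tasks and sort them from best to worst accuracy.
mmlu = {
    task: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task}: {acc:.3f}")
```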
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Uncensored | 2023-09-18T14:26:02.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Lazycuber/L2-7b-Guanaco-Uncensored
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Lazycuber/L2-7b-Guanaco-Uncensored](https://huggingface.co/Lazycuber/L2-7b-Guanaco-Uncensored)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Uncensored\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T14:24:41.596109](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Uncensored/blob/main/results_2023-09-18T14-24-41.596109.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.49036600008197573,\n\
\ \"acc_stderr\": 0.03524623648458121,\n \"acc_norm\": 0.49432587695797897,\n\
\ \"acc_norm_stderr\": 0.035234360851138714,\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024637,\n \"mc2\": 0.43423608772141165,\n\
\ \"mc2_stderr\": 0.01468977817324311\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4667235494880546,\n \"acc_stderr\": 0.01457899585960581,\n\
\ \"acc_norm\": 0.5059726962457338,\n \"acc_norm_stderr\": 0.014610348300255793\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5754829715196176,\n\
\ \"acc_stderr\": 0.004932593348813629,\n \"acc_norm\": 0.7698665604461262,\n\
\ \"acc_norm_stderr\": 0.004200578535056531\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621502,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621502\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.539622641509434,\n \"acc_stderr\": 0.03067609659938918,\n\
\ \"acc_norm\": 0.539622641509434,\n \"acc_norm_stderr\": 0.03067609659938918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.3988439306358382,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224469,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224469\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.045595221419582166,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.045595221419582166\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29365079365079366,\n \"acc_stderr\": 0.02345603738398203,\n \"\
acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.02345603738398203\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.532258064516129,\n \"acc_stderr\": 0.028384747788813332,\n \"\
acc_norm\": 0.532258064516129,\n \"acc_norm_stderr\": 0.028384747788813332\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264716,\n \"\
acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6161616161616161,\n \"acc_stderr\": 0.03464881675016341,\n \"\
acc_norm\": 0.6161616161616161,\n \"acc_norm_stderr\": 0.03464881675016341\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7098445595854922,\n \"acc_stderr\": 0.03275264467791516,\n\
\ \"acc_norm\": 0.7098445595854922,\n \"acc_norm_stderr\": 0.03275264467791516\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.441025641025641,\n \"acc_stderr\": 0.025174048384000756,\n \
\ \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.025174048384000756\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.673394495412844,\n \"acc_stderr\": 0.020106990889937303,\n \"\
acc_norm\": 0.673394495412844,\n \"acc_norm_stderr\": 0.020106990889937303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828978,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828978\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236434,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236434\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6582278481012658,\n \"acc_stderr\": 0.03087453753755362,\n \
\ \"acc_norm\": 0.6582278481012658,\n \"acc_norm_stderr\": 0.03087453753755362\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.03327283370271344,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.03327283370271344\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5460122699386503,\n \"acc_stderr\": 0.0391170190467718,\n\
\ \"acc_norm\": 0.5460122699386503,\n \"acc_norm_stderr\": 0.0391170190467718\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7307692307692307,\n\
\ \"acc_stderr\": 0.029058588303748842,\n \"acc_norm\": 0.7307692307692307,\n\
\ \"acc_norm_stderr\": 0.029058588303748842\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6871008939974457,\n\
\ \"acc_stderr\": 0.01658093594030406,\n \"acc_norm\": 0.6871008939974457,\n\
\ \"acc_norm_stderr\": 0.01658093594030406\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.02688264343402289,\n\
\ \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.02688264343402289\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2122905027932961,\n\
\ \"acc_stderr\": 0.013676644685831726,\n \"acc_norm\": 0.2122905027932961,\n\
\ \"acc_norm_stderr\": 0.013676644685831726\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5228758169934641,\n \"acc_stderr\": 0.028599936776089782,\n\
\ \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.028599936776089782\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5562700964630225,\n\
\ \"acc_stderr\": 0.028217683556652315,\n \"acc_norm\": 0.5562700964630225,\n\
\ \"acc_norm_stderr\": 0.028217683556652315\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.0276671385694227,\n\
\ \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.0276671385694227\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3455019556714472,\n\
\ \"acc_stderr\": 0.012145303004087206,\n \"acc_norm\": 0.3455019556714472,\n\
\ \"acc_norm_stderr\": 0.012145303004087206\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904611,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904611\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4869281045751634,\n \"acc_stderr\": 0.020220920829626923,\n \
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.020220920829626923\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5795918367346938,\n \"acc_stderr\": 0.03160106993449601,\n\
\ \"acc_norm\": 0.5795918367346938,\n \"acc_norm_stderr\": 0.03160106993449601\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495302,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.03446296217088427,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.03446296217088427\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024637,\n \"mc2\": 0.43423608772141165,\n\
\ \"mc2_stderr\": 0.01468977817324311\n }\n}\n```"
repo_url: https://huggingface.co/Lazycuber/L2-7b-Guanaco-Uncensored
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-24-41.596109.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-24-41.596109.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-24-41.596109.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-24-41.596109.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_24_41.596109
path:
- results_2023-09-18T14-24-41.596109.parquet
- split: latest
path:
- results_2023-09-18T14-24-41.596109.parquet
---
# Dataset Card for Evaluation run of Lazycuber/L2-7b-Guanaco-Uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Lazycuber/L2-7b-Guanaco-Uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Lazycuber/L2-7b-Guanaco-Uncensored](https://huggingface.co/Lazycuber/L2-7b-Guanaco-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Uncensored",
"harness_truthfulqa_mc_0",
split="train")
```
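As a complementary sketch, a specific run can also be loaded through its timestamped split rather than the "train" alias. The config and split names below are taken from the configuration list in this card's metadata; whether `load_dataset` accepts the timestamped split name verbatim is an assumption.
```python
from datasets import load_dataset

# A minimal sketch: load one task's details for a specific run.
# Split names follow the run timestamp; the name below is the single
# run declared in this card's configs (an assumption that it loads as-is).
arc_details = load_dataset(
    "open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Uncensored",
    "harness_arc_challenge_25",
    split="2023_09_18T14_24_41.596109",
)
```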
## Latest results
These are the [latest results from run 2023-09-18T14:24:41.596109](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Uncensored/blob/main/results_2023-09-18T14-24-41.596109.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.49036600008197573,
"acc_stderr": 0.03524623648458121,
"acc_norm": 0.49432587695797897,
"acc_norm_stderr": 0.035234360851138714,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024637,
"mc2": 0.43423608772141165,
"mc2_stderr": 0.01468977817324311
},
"harness|arc:challenge|25": {
"acc": 0.4667235494880546,
"acc_stderr": 0.01457899585960581,
"acc_norm": 0.5059726962457338,
"acc_norm_stderr": 0.014610348300255793
},
"harness|hellaswag|10": {
"acc": 0.5754829715196176,
"acc_stderr": 0.004932593348813629,
"acc_norm": 0.7698665604461262,
"acc_norm_stderr": 0.004200578535056531
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621502,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621502
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.539622641509434,
"acc_stderr": 0.03067609659938918,
"acc_norm": 0.539622641509434,
"acc_norm_stderr": 0.03067609659938918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5069444444444444,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.5069444444444444,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.045595221419582166,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.045595221419582166
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.02345603738398203,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.02345603738398203
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.532258064516129,
"acc_stderr": 0.028384747788813332,
"acc_norm": 0.532258064516129,
"acc_norm_stderr": 0.028384747788813332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.03445487686264716,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.03445487686264716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.03464881675016341,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.03464881675016341
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7098445595854922,
"acc_stderr": 0.03275264467791516,
"acc_norm": 0.7098445595854922,
"acc_norm_stderr": 0.03275264467791516
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.025174048384000756,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.025174048384000756
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.673394495412844,
"acc_stderr": 0.020106990889937303,
"acc_norm": 0.673394495412844,
"acc_norm_stderr": 0.020106990889937303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828978,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828978
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236434,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236434
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6582278481012658,
"acc_stderr": 0.03087453753755362,
"acc_norm": 0.6582278481012658,
"acc_norm_stderr": 0.03087453753755362
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.03327283370271344,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.03327283370271344
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5460122699386503,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.5460122699386503,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039476,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039476
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7307692307692307,
"acc_stderr": 0.029058588303748842,
"acc_norm": 0.7307692307692307,
"acc_norm_stderr": 0.029058588303748842
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6871008939974457,
"acc_stderr": 0.01658093594030406,
"acc_norm": 0.6871008939974457,
"acc_norm_stderr": 0.01658093594030406
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.02688264343402289,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.02688264343402289
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2122905027932961,
"acc_stderr": 0.013676644685831726,
"acc_norm": 0.2122905027932961,
"acc_norm_stderr": 0.013676644685831726
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5228758169934641,
"acc_stderr": 0.028599936776089782,
"acc_norm": 0.5228758169934641,
"acc_norm_stderr": 0.028599936776089782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5562700964630225,
"acc_stderr": 0.028217683556652315,
"acc_norm": 0.5562700964630225,
"acc_norm_stderr": 0.028217683556652315
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5524691358024691,
"acc_stderr": 0.0276671385694227,
"acc_norm": 0.5524691358024691,
"acc_norm_stderr": 0.0276671385694227
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614105,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614105
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3455019556714472,
"acc_stderr": 0.012145303004087206,
"acc_norm": 0.3455019556714472,
"acc_norm_stderr": 0.012145303004087206
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904611,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904611
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.020220920829626923,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.020220920829626923
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5795918367346938,
"acc_stderr": 0.03160106993449601,
"acc_norm": 0.5795918367346938,
"acc_norm_stderr": 0.03160106993449601
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495302,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.03446296217088427,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.03446296217088427
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024637,
"mc2": 0.43423608772141165,
"mc2_stderr": 0.01468977817324311
}
}
```
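The aggregated numbers above are also stored in the "results" configuration declared in this card's metadata. A minimal sketch for loading them, assuming the "latest" split alias resolves as declared:
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics shown above;
# "latest" is an alias pointing to the most recent run's parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Uncensored",
    "results",
    split="latest",
)
```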
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_health360__Healix-410M | 2023-09-18T14:27:05.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of health360/Healix-410M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [health360/Healix-410M](https://huggingface.co/health360/Healix-410M) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_health360__Healix-410M\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T14:25:49.264800](https://huggingface.co/datasets/open-llm-leaderboard/details_health360__Healix-410M/blob/main/results_2023-09-18T14-25-49.264800.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24946544058171313,\n\
\ \"acc_stderr\": 0.031196082232966424,\n \"acc_norm\": 0.25062061084842896,\n\
\ \"acc_norm_stderr\": 0.03121149862269038,\n \"mc1\": 0.2729498164014688,\n\
\ \"mc1_stderr\": 0.01559475363200653,\n \"mc2\": 0.44415007746402513,\n\
\ \"mc2_stderr\": 0.015540787534678682\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20819112627986347,\n \"acc_stderr\": 0.011864866118448069,\n\
\ \"acc_norm\": 0.2508532423208191,\n \"acc_norm_stderr\": 0.012668198621315432\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.29466241784505076,\n\
\ \"acc_stderr\": 0.0045495914900461915,\n \"acc_norm\": 0.3201553475403306,\n\
\ \"acc_norm_stderr\": 0.004655825980891999\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123408,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123408\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891366,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891366\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n\
\ \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.18497109826589594,\n\
\ \"acc_stderr\": 0.0296056239817712,\n \"acc_norm\": 0.18497109826589594,\n\
\ \"acc_norm_stderr\": 0.0296056239817712\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200214,\n\
\ \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200214\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.31290322580645163,\n \"acc_stderr\": 0.02637756702864586,\n \"\
acc_norm\": 0.31290322580645163,\n \"acc_norm_stderr\": 0.02637756702864586\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n \"\
acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23737373737373738,\n \"acc_stderr\": 0.030313710538198906,\n \"\
acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.030313710538198906\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.031618779179354094,\n\
\ \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.031618779179354094\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2512820512820513,\n \"acc_stderr\": 0.021992016662370547,\n\
\ \"acc_norm\": 0.2512820512820513,\n \"acc_norm_stderr\": 0.021992016662370547\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.027553614467863804,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.027553614467863804\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587546,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587546\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22018348623853212,\n \"acc_stderr\": 0.01776597865232756,\n \"\
acc_norm\": 0.22018348623853212,\n \"acc_norm_stderr\": 0.01776597865232756\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22549019607843138,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3811659192825112,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.3811659192825112,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596918,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596918\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755806,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755806\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1553398058252427,\n \"acc_stderr\": 0.035865947385739734,\n\
\ \"acc_norm\": 0.1553398058252427,\n \"acc_norm_stderr\": 0.035865947385739734\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24648786717752236,\n\
\ \"acc_stderr\": 0.015411308769686943,\n \"acc_norm\": 0.24648786717752236,\n\
\ \"acc_norm_stderr\": 0.015411308769686943\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.01446589382985994,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.01446589382985994\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.025261691219729484,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.025261691219729484\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2282958199356913,\n\
\ \"acc_stderr\": 0.023839303311398215,\n \"acc_norm\": 0.2282958199356913,\n\
\ \"acc_norm_stderr\": 0.023839303311398215\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22839506172839505,\n \"acc_stderr\": 0.023358211840626267,\n\
\ \"acc_norm\": 0.22839506172839505,\n \"acc_norm_stderr\": 0.023358211840626267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2320730117340287,\n\
\ \"acc_stderr\": 0.010782046665905182,\n \"acc_norm\": 0.2320730117340287,\n\
\ \"acc_norm_stderr\": 0.010782046665905182\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4338235294117647,\n \"acc_stderr\": 0.03010563657001664,\n\
\ \"acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.03010563657001664\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.040139645540727756,\n\
\ \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.040139645540727756\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2612244897959184,\n\
\ \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.2612244897959184,\n\
\ \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401466,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401466\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.21686746987951808,\n \"acc_stderr\": 0.03208284450356365,\n\
\ \"acc_norm\": 0.21686746987951808,\n \"acc_norm_stderr\": 0.03208284450356365\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21637426900584794,\n\
\ \"acc_stderr\": 0.03158149539338734,\n \"acc_norm\": 0.21637426900584794,\n\
\ \"acc_norm_stderr\": 0.03158149539338734\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2729498164014688,\n \"mc1_stderr\": 0.01559475363200653,\n\
\ \"mc2\": 0.44415007746402513,\n \"mc2_stderr\": 0.015540787534678682\n\
\ }\n}\n```"
repo_url: https://huggingface.co/health360/Healix-410M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-25-49.264800.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-25-49.264800.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-25-49.264800.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-25-49.264800.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_25_49.264800
path:
- results_2023-09-18T14-25-49.264800.parquet
- split: latest
path:
- results_2023-09-18T14-25-49.264800.parquet
---
# Dataset Card for Evaluation run of health360/Healix-410M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/health360/Healix-410M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [health360/Healix-410M](https://huggingface.co/health360/Healix-410M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_health360__Healix-410M",
"harness_truthfulqa_mc_0",
split="train")
```
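The aggregated metrics of a run are exposed through the `results` configuration listed in the metadata above; as a minimal sketch, they can be loaded the same way, with the `latest` split pointing at the most recent run:
```python
from datasets import load_dataset

# Aggregated metrics of the most recent run, via the "results" config.
results = load_dataset("open-llm-leaderboard/details_health360__Healix-410M",
                       "results",
                       split="latest")
```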
## Latest results
These are the [latest results from run 2023-09-18T14:25:49.264800](https://huggingface.co/datasets/open-llm-leaderboard/details_health360__Healix-410M/blob/main/results_2023-09-18T14-25-49.264800.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24946544058171313,
"acc_stderr": 0.031196082232966424,
"acc_norm": 0.25062061084842896,
"acc_norm_stderr": 0.03121149862269038,
"mc1": 0.2729498164014688,
"mc1_stderr": 0.01559475363200653,
"mc2": 0.44415007746402513,
"mc2_stderr": 0.015540787534678682
},
"harness|arc:challenge|25": {
"acc": 0.20819112627986347,
"acc_stderr": 0.011864866118448069,
"acc_norm": 0.2508532423208191,
"acc_norm_stderr": 0.012668198621315432
},
"harness|hellaswag|10": {
"acc": 0.29466241784505076,
"acc_stderr": 0.0045495914900461915,
"acc_norm": 0.3201553475403306,
"acc_norm_stderr": 0.004655825980891999
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740206,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740206
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123408,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123408
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.025288394502891366,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.025288394502891366
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.18497109826589594,
"acc_stderr": 0.0296056239817712,
"acc_norm": 0.18497109826589594,
"acc_norm_stderr": 0.0296056239817712
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200214,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200214
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333337,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333337
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.31290322580645163,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.31290322580645163,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.030313710538198906,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.030313710538198906
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.031618779179354094,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.031618779179354094
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2512820512820513,
"acc_stderr": 0.021992016662370547,
"acc_norm": 0.2512820512820513,
"acc_norm_stderr": 0.021992016662370547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.027553614467863804,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.027553614467863804
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587546,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587546
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22018348623853212,
"acc_stderr": 0.01776597865232756,
"acc_norm": 0.22018348623853212,
"acc_norm_stderr": 0.01776597865232756
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3811659192825112,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.3811659192825112,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596918,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596918
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755806,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755806
},
"harness|hendrycksTest-management|5": {
"acc": 0.1553398058252427,
"acc_stderr": 0.035865947385739734,
"acc_norm": 0.1553398058252427,
"acc_norm_stderr": 0.035865947385739734
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24648786717752236,
"acc_stderr": 0.015411308769686943,
"acc_norm": 0.24648786717752236,
"acc_norm_stderr": 0.015411308769686943
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.01446589382985994,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.01446589382985994
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2282958199356913,
"acc_stderr": 0.023839303311398215,
"acc_norm": 0.2282958199356913,
"acc_norm_stderr": 0.023839303311398215
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22839506172839505,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.22839506172839505,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2320730117340287,
"acc_stderr": 0.010782046665905182,
"acc_norm": 0.2320730117340287,
"acc_norm_stderr": 0.010782046665905182
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4338235294117647,
"acc_stderr": 0.03010563657001664,
"acc_norm": 0.4338235294117647,
"acc_norm_stderr": 0.03010563657001664
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.040139645540727756,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.040139645540727756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2612244897959184,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.2612244897959184,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401466,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401466
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21637426900584794,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.21637426900584794,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2729498164014688,
"mc1_stderr": 0.01559475363200653,
"mc2": 0.44415007746402513,
"mc2_stderr": 0.015540787534678682
}
}
```
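The per-sample details behind these aggregates are available through the per-task configurations listed above; as a sketch, one might inspect a single MMLU subtask (the exact columns depend on the harness version, so they are only printed here):
```python
from datasets import load_dataset

# Per-sample details for one MMLU subtask, at the latest run.
details = load_dataset("open-llm-leaderboard/details_health360__Healix-410M",
                       "harness_hendrycksTest_professional_medicine_5",
                       split="latest")
print(details.column_names)  # see what the harness logged for each sample
```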
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
grostaco/test | 2023-09-18T14:37:29.000Z | [
"task_categories:text-classification",
"language:en",
"sentiment",
"region:us"
] | grostaco | null | null | null | 0 | 0 | ---
task_categories:
- text-classification
language:
- en
tags:
- sentiment
dataset_info:
- config_name: default
  features:
  - name: content
    dtype: string
  - name: sentiment
    dtype:
      class_label:
        names:
          '0': positive
          '1': negative
          '2': neutral
  splits:
  - name: train
    num_examples: 3
--- |
open-llm-leaderboard/details_TinyPixel__testmodel2 | 2023-09-18T14:29:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TinyPixel/testmodel2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TinyPixel/testmodel2](https://huggingface.co/TinyPixel/testmodel2) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TinyPixel__testmodel2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T14:28:17.558290](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__testmodel2/blob/main/results_2023-09-18T14-28-17.558290.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46863022947647875,\n\
\ \"acc_stderr\": 0.03525899737630071,\n \"acc_norm\": 0.4726646011333369,\n\
\ \"acc_norm_stderr\": 0.03524450201413607,\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.0153218216884762,\n \"mc2\": 0.3917355712197997,\n\
\ \"mc2_stderr\": 0.013582107635745794\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49146757679180886,\n \"acc_stderr\": 0.01460926316563219,\n\
\ \"acc_norm\": 0.5324232081911263,\n \"acc_norm_stderr\": 0.014580637569995421\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5907189802828122,\n\
\ \"acc_stderr\": 0.004906962980328293,\n \"acc_norm\": 0.7877912766381199,\n\
\ \"acc_norm_stderr\": 0.0040803622082511695\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3881578947368421,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.3881578947368421,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.44150943396226416,\n \"acc_stderr\": 0.03056159042673184,\n\
\ \"acc_norm\": 0.44150943396226416,\n \"acc_norm_stderr\": 0.03056159042673184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5096774193548387,\n \"acc_stderr\": 0.02843867799890955,\n \"\
acc_norm\": 0.5096774193548387,\n \"acc_norm_stderr\": 0.02843867799890955\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n \"\
acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.48484848484848486,\n \"acc_stderr\": 0.0356071651653106,\n \"\
acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.0356071651653106\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.0329229663915514,\n\
\ \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.0329229663915514\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4512820512820513,\n \"acc_stderr\": 0.025230381238934833,\n\
\ \"acc_norm\": 0.4512820512820513,\n \"acc_norm_stderr\": 0.025230381238934833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4369747899159664,\n \"acc_stderr\": 0.03221943636566196,\n \
\ \"acc_norm\": 0.4369747899159664,\n \"acc_norm_stderr\": 0.03221943636566196\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6293577981651376,\n \"acc_stderr\": 0.02070745816435298,\n \"\
acc_norm\": 0.6293577981651376,\n \"acc_norm_stderr\": 0.02070745816435298\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.03054674526495318,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.03054674526495318\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5294117647058824,\n \"acc_stderr\": 0.03503235296367992,\n \"\
acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03503235296367992\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.620253164556962,\n \"acc_stderr\": 0.031591887529658504,\n \
\ \"acc_norm\": 0.620253164556962,\n \"acc_norm_stderr\": 0.031591887529658504\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.043207678075366705,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.043207678075366705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n\
\ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.049111471073657764,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.049111471073657764\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n\
\ \"acc_stderr\": 0.02987257770889119,\n \"acc_norm\": 0.7051282051282052,\n\
\ \"acc_norm_stderr\": 0.02987257770889119\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6411238825031929,\n\
\ \"acc_stderr\": 0.017152991797501342,\n \"acc_norm\": 0.6411238825031929,\n\
\ \"acc_norm_stderr\": 0.017152991797501342\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.48265895953757226,\n \"acc_stderr\": 0.026902900458666647,\n\
\ \"acc_norm\": 0.48265895953757226,\n \"acc_norm_stderr\": 0.026902900458666647\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556047,\n\
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556047\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.027882383791325953,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.027882383791325953\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.49691358024691357,\n \"acc_stderr\": 0.027820214158594384,\n\
\ \"acc_norm\": 0.49691358024691357,\n \"acc_norm_stderr\": 0.027820214158594384\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281278,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281278\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36114732724902215,\n\
\ \"acc_stderr\": 0.01226793547751903,\n \"acc_norm\": 0.36114732724902215,\n\
\ \"acc_norm_stderr\": 0.01226793547751903\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.44607843137254904,\n \"acc_stderr\": 0.020109864547181357,\n \
\ \"acc_norm\": 0.44607843137254904,\n \"acc_norm_stderr\": 0.020109864547181357\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n\
\ \"acc_stderr\": 0.0478833976870286,\n \"acc_norm\": 0.509090909090909,\n\
\ \"acc_norm_stderr\": 0.0478833976870286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4816326530612245,\n \"acc_stderr\": 0.031987615467631264,\n\
\ \"acc_norm\": 0.4816326530612245,\n \"acc_norm_stderr\": 0.031987615467631264\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.0153218216884762,\n \"mc2\": 0.3917355712197997,\n\
\ \"mc2_stderr\": 0.013582107635745794\n }\n}\n```"
repo_url: https://huggingface.co/TinyPixel/testmodel2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-28-17.558290.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-28-17.558290.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-28-17.558290.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-28-17.558290.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_28_17.558290
path:
- results_2023-09-18T14-28-17.558290.parquet
- split: latest
path:
- results_2023-09-18T14-28-17.558290.parquet
---
# Dataset Card for Evaluation run of TinyPixel/testmodel2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TinyPixel/testmodel2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TinyPixel/testmodel2](https://huggingface.co/TinyPixel/testmodel2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TinyPixel__testmodel2",
"harness_truthfulqa_mc_0",
split="train")
```
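For example, to inspect the aggregated metrics rather than the per-sample details, you can load the `results` configuration at its `latest` split. This is a short sketch: the config and split names come from the YAML header of this card, but the exact record layout of the results parquet is an assumption.

```python
from datasets import load_dataset

# The "results" config and the "latest" split are declared in this card's
# YAML header, alongside the per-task harness configs.
results = load_dataset(
    "open-llm-leaderboard/details_TinyPixel__testmodel2",
    "results",
    split="latest",
)

# Assumption: each row holds the aggregated metrics of one evaluation run.
print(results[0])
```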
## Latest results
These are the [latest results from run 2023-09-18T14:28:17.558290](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__testmodel2/blob/main/results_2023-09-18T14-28-17.558290.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.46863022947647875,
"acc_stderr": 0.03525899737630071,
"acc_norm": 0.4726646011333369,
"acc_norm_stderr": 0.03524450201413607,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.0153218216884762,
"mc2": 0.3917355712197997,
"mc2_stderr": 0.013582107635745794
},
"harness|arc:challenge|25": {
"acc": 0.49146757679180886,
"acc_stderr": 0.01460926316563219,
"acc_norm": 0.5324232081911263,
"acc_norm_stderr": 0.014580637569995421
},
"harness|hellaswag|10": {
"acc": 0.5907189802828122,
"acc_stderr": 0.004906962980328293,
"acc_norm": 0.7877912766381199,
"acc_norm_stderr": 0.0040803622082511695
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3881578947368421,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.3881578947368421,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44150943396226416,
"acc_stderr": 0.03056159042673184,
"acc_norm": 0.44150943396226416,
"acc_norm_stderr": 0.03056159042673184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159393,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159393
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5096774193548387,
"acc_stderr": 0.02843867799890955,
"acc_norm": 0.5096774193548387,
"acc_norm_stderr": 0.02843867799890955
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512567,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512567
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.0356071651653106,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.0356071651653106
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.0329229663915514,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.0329229663915514
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4512820512820513,
"acc_stderr": 0.025230381238934833,
"acc_norm": 0.4512820512820513,
"acc_norm_stderr": 0.025230381238934833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4369747899159664,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.4369747899159664,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6293577981651376,
"acc_stderr": 0.02070745816435298,
"acc_norm": 0.6293577981651376,
"acc_norm_stderr": 0.02070745816435298
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03054674526495318,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03054674526495318
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03503235296367992,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03503235296367992
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.620253164556962,
"acc_stderr": 0.031591887529658504,
"acc_norm": 0.620253164556962,
"acc_norm_stderr": 0.031591887529658504
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.043207678075366705,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.043207678075366705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.03926522378708843,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.03926522378708843
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.049111471073657764,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.049111471073657764
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.02987257770889119,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.02987257770889119
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6411238825031929,
"acc_stderr": 0.017152991797501342,
"acc_norm": 0.6411238825031929,
"acc_norm_stderr": 0.017152991797501342
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.48265895953757226,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.48265895953757226,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.028627470550556047,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.028627470550556047
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325953,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325953
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49691358024691357,
"acc_stderr": 0.027820214158594384,
"acc_norm": 0.49691358024691357,
"acc_norm_stderr": 0.027820214158594384
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.028723863853281278,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.028723863853281278
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36114732724902215,
"acc_stderr": 0.01226793547751903,
"acc_norm": 0.36114732724902215,
"acc_norm_stderr": 0.01226793547751903
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.44607843137254904,
"acc_stderr": 0.020109864547181357,
"acc_norm": 0.44607843137254904,
"acc_norm_stderr": 0.020109864547181357
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.0478833976870286,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.0478833976870286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4816326530612245,
"acc_stderr": 0.031987615467631264,
"acc_norm": 0.4816326530612245,
"acc_norm_stderr": 0.031987615467631264
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.0153218216884762,
"mc2": 0.3917355712197997,
"mc2_stderr": 0.013582107635745794
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_euclaise__falcon_1b_stage2 | 2023-10-03T15:02:46.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of euclaise/falcon_1b_stage2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [euclaise/falcon_1b_stage2](https://huggingface.co/euclaise/falcon_1b_stage2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_euclaise__falcon_1b_stage2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T15:01:26.920880](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage2/blob/main/results_2023-10-03T15-01-26.920880.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24715845782563986,\n\
\ \"acc_stderr\": 0.03130699017472766,\n \"acc_norm\": 0.25035482080287663,\n\
\ \"acc_norm_stderr\": 0.03130843554528131,\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757447,\n \"mc2\": 0.3839665638374818,\n\
\ \"mc2_stderr\": 0.013753035475597343\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.30887372013651876,\n \"acc_stderr\": 0.013501770929344003,\n\
\ \"acc_norm\": 0.3310580204778157,\n \"acc_norm_stderr\": 0.013752062419817827\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46554471220872334,\n\
\ \"acc_stderr\": 0.0049779199068753655,\n \"acc_norm\": 0.6319458275243975,\n\
\ \"acc_norm_stderr\": 0.004812905279066442\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313141,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313141\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.025604233470899098,\n\
\ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.025604233470899098\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.03214737302029469,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.03214737302029469\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.028957342788342347,\n\
\ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.028957342788342347\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.03835153954399421,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.03835153954399421\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.034165204477475494,\n\
\ \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.034165204477475494\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23544973544973544,\n \"acc_stderr\": 0.02185150982203172,\n \"\
acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.02185150982203172\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.18387096774193548,\n \"acc_stderr\": 0.02203721734026784,\n \"\
acc_norm\": 0.18387096774193548,\n \"acc_norm_stderr\": 0.02203721734026784\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.1921182266009852,\n \"acc_stderr\": 0.02771931570961478,\n \"\
acc_norm\": 0.1921182266009852,\n \"acc_norm_stderr\": 0.02771931570961478\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19170984455958548,\n \"acc_stderr\": 0.028408953626245296,\n\
\ \"acc_norm\": 0.19170984455958548,\n \"acc_norm_stderr\": 0.028408953626245296\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2076923076923077,\n \"acc_stderr\": 0.020567539567246808,\n\
\ \"acc_norm\": 0.2076923076923077,\n \"acc_norm_stderr\": 0.020567539567246808\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960954,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960954\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.02684151432295893,\n \
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.02684151432295893\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22385321100917432,\n \"acc_stderr\": 0.017871217767790232,\n \"\
acc_norm\": 0.22385321100917432,\n \"acc_norm_stderr\": 0.017871217767790232\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.19907407407407407,\n \"acc_stderr\": 0.027232298462690235,\n \"\
acc_norm\": 0.19907407407407407,\n \"acc_norm_stderr\": 0.027232298462690235\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.27848101265822783,\n \"acc_stderr\": 0.02917868230484255,\n\
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.02917868230484255\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.29596412556053814,\n\
\ \"acc_stderr\": 0.030636591348699817,\n \"acc_norm\": 0.29596412556053814,\n\
\ \"acc_norm_stderr\": 0.030636591348699817\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596918,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596918\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.03893542518824846,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.03893542518824846\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\
\ \"acc_stderr\": 0.029614323690456648,\n \"acc_norm\": 0.2863247863247863,\n\
\ \"acc_norm_stderr\": 0.029614323690456648\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2388250319284802,\n\
\ \"acc_stderr\": 0.015246803197398691,\n \"acc_norm\": 0.2388250319284802,\n\
\ \"acc_norm_stderr\": 0.015246803197398691\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.02335736578587404,\n\
\ \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.02335736578587404\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2346368715083799,\n\
\ \"acc_stderr\": 0.014173044098303656,\n \"acc_norm\": 0.2346368715083799,\n\
\ \"acc_norm_stderr\": 0.014173044098303656\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023805186524888156,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023805186524888156\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1832797427652733,\n\
\ \"acc_stderr\": 0.021974198848265812,\n \"acc_norm\": 0.1832797427652733,\n\
\ \"acc_norm_stderr\": 0.021974198848265812\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543325,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543325\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872412,\n \
\ \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872412\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23533246414602346,\n\
\ \"acc_stderr\": 0.010834432543912231,\n \"acc_norm\": 0.23533246414602346,\n\
\ \"acc_norm_stderr\": 0.010834432543912231\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2757352941176471,\n \"acc_stderr\": 0.027146271936625166,\n\
\ \"acc_norm\": 0.2757352941176471,\n \"acc_norm_stderr\": 0.027146271936625166\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24509803921568626,\n \"acc_stderr\": 0.01740181671142766,\n \
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.01740181671142766\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n\
\ \"acc_stderr\": 0.034843315926805875,\n \"acc_norm\": 0.27710843373493976,\n\
\ \"acc_norm_stderr\": 0.034843315926805875\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3508771929824561,\n \"acc_stderr\": 0.036602988340491624,\n\
\ \"acc_norm\": 0.3508771929824561,\n \"acc_norm_stderr\": 0.036602988340491624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757447,\n \"mc2\": 0.3839665638374818,\n\
\ \"mc2_stderr\": 0.013753035475597343\n }\n}\n```"
repo_url: https://huggingface.co/euclaise/falcon_1b_stage2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|arc:challenge|25_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hellaswag|10_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-33-29.155732.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T15-01-26.920880.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-01-26.920880.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-33-29.155732.parquet'
- split: 2023_10_03T15_01_26.920880
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T15-01-26.920880.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T15-01-26.920880.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_33_29.155732
path:
- results_2023-09-18T14-33-29.155732.parquet
- split: 2023_10_03T15_01_26.920880
path:
- results_2023-10-03T15-01-26.920880.parquet
- split: latest
path:
- results_2023-10-03T15-01-26.920880.parquet
---
# Dataset Card for Evaluation run of euclaise/falcon_1b_stage2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/euclaise/falcon_1b_stage2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [euclaise/falcon_1b_stage2](https://huggingface.co/euclaise/falcon_1b_stage2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_euclaise__falcon_1b_stage2",
"harness_truthfulqa_mc_0",
split="train")
```
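You can also load the aggregated metrics or the details of one specific run; a minimal sketch, where the config names ("results", "harness_truthfulqa_mc_0") and the timestamp split are taken directly from the YAML header of this card:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_euclaise__falcon_1b_stage2"

# Aggregated metrics across all tasks (the "results" config defined above)
results = load_dataset(REPO, "results", split="latest")

# Per-sample details from one specific run, selected by its timestamp split
run_details = load_dataset(
    REPO,
    "harness_truthfulqa_mc_0",
    split="2023_10_03T15_01_26.920880",
)
```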
## Latest results
These are the [latest results from run 2023-10-03T15:01:26.920880](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage2/blob/main/results_2023-10-03T15-01-26.920880.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24715845782563986,
"acc_stderr": 0.03130699017472766,
"acc_norm": 0.25035482080287663,
"acc_norm_stderr": 0.03130843554528131,
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757447,
"mc2": 0.3839665638374818,
"mc2_stderr": 0.013753035475597343
},
"harness|arc:challenge|25": {
"acc": 0.30887372013651876,
"acc_stderr": 0.013501770929344003,
"acc_norm": 0.3310580204778157,
"acc_norm_stderr": 0.013752062419817827
},
"harness|hellaswag|10": {
"acc": 0.46554471220872334,
"acc_stderr": 0.0049779199068753655,
"acc_norm": 0.6319458275243975,
"acc_norm_stderr": 0.004812905279066442
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313141,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313141
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.025604233470899098,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.025604233470899098
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.03214737302029469,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.03214737302029469
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2680851063829787,
"acc_stderr": 0.028957342788342347,
"acc_norm": 0.2680851063829787,
"acc_norm_stderr": 0.028957342788342347
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03835153954399421,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03835153954399421
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.02185150982203172,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.02185150982203172
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.040406101782088394,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.040406101782088394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18387096774193548,
"acc_stderr": 0.02203721734026784,
"acc_norm": 0.18387096774193548,
"acc_norm_stderr": 0.02203721734026784
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1921182266009852,
"acc_stderr": 0.02771931570961478,
"acc_norm": 0.1921182266009852,
"acc_norm_stderr": 0.02771931570961478
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19170984455958548,
"acc_stderr": 0.028408953626245296,
"acc_norm": 0.19170984455958548,
"acc_norm_stderr": 0.028408953626245296
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2076923076923077,
"acc_stderr": 0.020567539567246808,
"acc_norm": 0.2076923076923077,
"acc_norm_stderr": 0.020567539567246808
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960954,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960954
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.02684151432295893,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.02684151432295893
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22385321100917432,
"acc_stderr": 0.017871217767790232,
"acc_norm": 0.22385321100917432,
"acc_norm_stderr": 0.017871217767790232
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19907407407407407,
"acc_stderr": 0.027232298462690235,
"acc_norm": 0.19907407407407407,
"acc_norm_stderr": 0.027232298462690235
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.02917868230484255,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.02917868230484255
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.29596412556053814,
"acc_stderr": 0.030636591348699817,
"acc_norm": 0.29596412556053814,
"acc_norm_stderr": 0.030636591348699817
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596918,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596918
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.03893542518824846,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.03893542518824846
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.029614323690456648,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.029614323690456648
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2388250319284802,
"acc_stderr": 0.015246803197398691,
"acc_norm": 0.2388250319284802,
"acc_norm_stderr": 0.015246803197398691
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.02335736578587404,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.02335736578587404
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2346368715083799,
"acc_stderr": 0.014173044098303656,
"acc_norm": 0.2346368715083799,
"acc_norm_stderr": 0.014173044098303656
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023805186524888156,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023805186524888156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1832797427652733,
"acc_stderr": 0.021974198848265812,
"acc_norm": 0.1832797427652733,
"acc_norm_stderr": 0.021974198848265812
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023132376234543325,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023132376234543325
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872412,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872412
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23533246414602346,
"acc_stderr": 0.010834432543912231,
"acc_norm": 0.23533246414602346,
"acc_norm_stderr": 0.010834432543912231
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2757352941176471,
"acc_stderr": 0.027146271936625166,
"acc_norm": 0.2757352941176471,
"acc_norm_stderr": 0.027146271936625166
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.01740181671142766,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.01740181671142766
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.034843315926805875,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.034843315926805875
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.036602988340491624,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.036602988340491624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757447,
"mc2": 0.3839665638374818,
"mc2_stderr": 0.013753035475597343
}
}
```
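To check which task configs and run splits are available before loading anything, here is a small sketch using the `datasets` helper functions (assuming access to the Hub; the repo and config names are the ones from this card):
```python
from datasets import get_dataset_config_names, get_dataset_split_names

REPO = "open-llm-leaderboard/details_euclaise__falcon_1b_stage2"

# One config per evaluated task, plus the aggregated "results" config
configs = get_dataset_config_names(REPO)
print(f"{len(configs)} configs, e.g. {configs[:3]}")

# Each config exposes one split per run timestamp, plus a "latest" alias
print(get_dataset_split_names(REPO, "harness_truthfulqa_mc_0"))
```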
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
andyP/fake_news_en_opensources | 2023-09-18T15:09:39.000Z | [
"license:apache-2.0",
"region:us"
] | andyP | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Rallio67__3B-redpajama-conditional-alpha | 2023-09-18T14:38:13.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Rallio67/3B-redpajama-conditional-alpha
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Rallio67/3B-redpajama-conditional-alpha](https://huggingface.co/Rallio67/3B-redpajama-conditional-alpha)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Rallio67__3B-redpajama-conditional-alpha\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T14:36:57.601576](https://huggingface.co/datasets/open-llm-leaderboard/details_Rallio67__3B-redpajama-conditional-alpha/blob/main/results_2023-09-18T14-36-57.601576.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the \"results\" config and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25865717924050413,\n\
\ \"acc_stderr\": 0.031653334974453044,\n \"acc_norm\": 0.26221002209486993,\n\
\ \"acc_norm_stderr\": 0.031658525135534556,\n \"mc1\": 0.20563035495716034,\n\
\ \"mc1_stderr\": 0.014148482219460972,\n \"mc2\": 0.36312325124908573,\n\
\ \"mc2_stderr\": 0.01357844144939031\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3191126279863481,\n \"acc_stderr\": 0.013621696119173307,\n\
\ \"acc_norm\": 0.3626279863481229,\n \"acc_norm_stderr\": 0.014049106564955016\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45289782911770565,\n\
\ \"acc_stderr\": 0.004967591267557404,\n \"acc_norm\": 0.6190001991635132,\n\
\ \"acc_norm_stderr\": 0.004846400325585235\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.34074074074074073,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.27631578947368424,\n \"acc_stderr\": 0.03639057569952924,\n\
\ \"acc_norm\": 0.27631578947368424,\n \"acc_norm_stderr\": 0.03639057569952924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.026341480371118362,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.026341480371118362\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416544,\n\
\ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416544\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838725,\n\
\ \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838725\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.03395490020856113,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.03395490020856113\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23225806451612904,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.23225806451612904,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.02989611429173355,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.02989611429173355\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.30808080808080807,\n \"acc_stderr\": 0.03289477330098617,\n \"\
acc_norm\": 0.30808080808080807,\n \"acc_norm_stderr\": 0.03289477330098617\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178274,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178274\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.19743589743589743,\n \"acc_stderr\": 0.02018264696867483,\n\
\ \"acc_norm\": 0.19743589743589743,\n \"acc_norm_stderr\": 0.02018264696867483\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230182,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230182\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279496,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279496\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23119266055045873,\n\
\ \"acc_stderr\": 0.01807575024163316,\n \"acc_norm\": 0.23119266055045873,\n\
\ \"acc_norm_stderr\": 0.01807575024163316\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.02915752218460561,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.02915752218460561\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.03096451792692341,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.03096451792692341\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658335,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.273542600896861,\n\
\ \"acc_stderr\": 0.029918586707798824,\n \"acc_norm\": 0.273542600896861,\n\
\ \"acc_norm_stderr\": 0.029918586707798824\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.03498149385462471,\n\
\ \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.03498149385462471\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.39669421487603307,\n \"acc_stderr\": 0.04465869780531009,\n \"\
acc_norm\": 0.39669421487603307,\n \"acc_norm_stderr\": 0.04465869780531009\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755805,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755805\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2264957264957265,\n\
\ \"acc_stderr\": 0.0274210072953929,\n \"acc_norm\": 0.2264957264957265,\n\
\ \"acc_norm_stderr\": 0.0274210072953929\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.015671006009339572,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.015671006009339572\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.31213872832369943,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.31213872832369943,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\
\ \"acc_stderr\": 0.014378169884098414,\n \"acc_norm\": 0.2446927374301676,\n\
\ \"acc_norm_stderr\": 0.014378169884098414\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.31511254019292606,\n\
\ \"acc_stderr\": 0.02638527370346448,\n \"acc_norm\": 0.31511254019292606,\n\
\ \"acc_norm_stderr\": 0.02638527370346448\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.024748624490537375,\n\
\ \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.024748624490537375\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843014,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843014\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25554106910039115,\n\
\ \"acc_stderr\": 0.011139857833598511,\n \"acc_norm\": 0.25554106910039115,\n\
\ \"acc_norm_stderr\": 0.011139857833598511\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1948529411764706,\n \"acc_stderr\": 0.024060599423487424,\n\
\ \"acc_norm\": 0.1948529411764706,\n \"acc_norm_stderr\": 0.024060599423487424\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2581699346405229,\n \"acc_stderr\": 0.017704531653250075,\n \
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.017704531653250075\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22448979591836735,\n \"acc_stderr\": 0.026711430555538408,\n\
\ \"acc_norm\": 0.22448979591836735,\n \"acc_norm_stderr\": 0.026711430555538408\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n\
\ \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n\
\ \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20563035495716034,\n\
\ \"mc1_stderr\": 0.014148482219460972,\n \"mc2\": 0.36312325124908573,\n\
\ \"mc2_stderr\": 0.01357844144939031\n }\n}\n```"
repo_url: https://huggingface.co/Rallio67/3B-redpajama-conditional-alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-36-57.601576.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-36-57.601576.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-36-57.601576.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-36-57.601576.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_36_57.601576
path:
- results_2023-09-18T14-36-57.601576.parquet
- split: latest
path:
- results_2023-09-18T14-36-57.601576.parquet
---
# Dataset Card for Evaluation run of Rallio67/3B-redpajama-conditional-alpha
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Rallio67/3B-redpajama-conditional-alpha
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Rallio67/3B-redpajama-conditional-alpha](https://huggingface.co/Rallio67/3B-redpajama-conditional-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Rallio67__3B-redpajama-conditional-alpha",
"harness_truthfulqa_mc_0",
split="train")
```
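The same pattern works for any other configuration listed above; as a minimal sketch under the same layout, the aggregated "results" configuration can be loaded at its "latest" split:
```python
from datasets import load_dataset

# "results" is the aggregated configuration and "latest" is the split that
# always points to the most recent run, as described above.
results = load_dataset(
    "open-llm-leaderboard/details_Rallio67__3B-redpajama-conditional-alpha",
    "results",
    split="latest",
)
```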
## Latest results
These are the [latest results from run 2023-09-18T14:36:57.601576](https://huggingface.co/datasets/open-llm-leaderboard/details_Rallio67__3B-redpajama-conditional-alpha/blob/main/results_2023-09-18T14-36-57.601576.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25865717924050413,
"acc_stderr": 0.031653334974453044,
"acc_norm": 0.26221002209486993,
"acc_norm_stderr": 0.031658525135534556,
"mc1": 0.20563035495716034,
"mc1_stderr": 0.014148482219460972,
"mc2": 0.36312325124908573,
"mc2_stderr": 0.01357844144939031
},
"harness|arc:challenge|25": {
"acc": 0.3191126279863481,
"acc_stderr": 0.013621696119173307,
"acc_norm": 0.3626279863481229,
"acc_norm_stderr": 0.014049106564955016
},
"harness|hellaswag|10": {
"acc": 0.45289782911770565,
"acc_stderr": 0.004967591267557404,
"acc_norm": 0.6190001991635132,
"acc_norm_stderr": 0.004846400325585235
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.27631578947368424,
"acc_stderr": 0.03639057569952924,
"acc_norm": 0.27631578947368424,
"acc_norm_stderr": 0.03639057569952924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.026341480371118362,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.026341480371118362
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03708284662416544,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03708284662416544
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.026556982117838725,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.026556982117838725
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.038351539543994194,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.038351539543994194
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856113,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856113
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23225806451612904,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.23225806451612904,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.02989611429173355,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.02989611429173355
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.30808080808080807,
"acc_stderr": 0.03289477330098617,
"acc_norm": 0.30808080808080807,
"acc_norm_stderr": 0.03289477330098617
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178274,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178274
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.19743589743589743,
"acc_stderr": 0.02018264696867483,
"acc_norm": 0.19743589743589743,
"acc_norm_stderr": 0.02018264696867483
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230182,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230182
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279496,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279496
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23119266055045873,
"acc_stderr": 0.01807575024163316,
"acc_norm": 0.23119266055045873,
"acc_norm_stderr": 0.01807575024163316
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.02915752218460561,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.02915752218460561
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.03096451792692341,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.03096451792692341
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658335,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.273542600896861,
"acc_stderr": 0.029918586707798824,
"acc_norm": 0.273542600896861,
"acc_norm_stderr": 0.029918586707798824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.1984732824427481,
"acc_stderr": 0.03498149385462471,
"acc_norm": 0.1984732824427481,
"acc_norm_stderr": 0.03498149385462471
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.39669421487603307,
"acc_stderr": 0.04465869780531009,
"acc_norm": 0.39669421487603307,
"acc_norm_stderr": 0.04465869780531009
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755805,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755805
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2264957264957265,
"acc_stderr": 0.0274210072953929,
"acc_norm": 0.2264957264957265,
"acc_norm_stderr": 0.0274210072953929
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.015671006009339572,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.015671006009339572
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098414,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098414
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.31511254019292606,
"acc_stderr": 0.02638527370346448,
"acc_norm": 0.31511254019292606,
"acc_norm_stderr": 0.02638527370346448
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.024748624490537375,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.024748624490537375
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843014,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843014
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25554106910039115,
"acc_stderr": 0.011139857833598511,
"acc_norm": 0.25554106910039115,
"acc_norm_stderr": 0.011139857833598511
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1948529411764706,
"acc_stderr": 0.024060599423487424,
"acc_norm": 0.1948529411764706,
"acc_norm_stderr": 0.024060599423487424
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.017704531653250075,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.017704531653250075
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22448979591836735,
"acc_stderr": 0.026711430555538408,
"acc_norm": 0.22448979591836735,
"acc_norm_stderr": 0.026711430555538408
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.20563035495716034,
"mc1_stderr": 0.014148482219460972,
"mc2": 0.36312325124908573,
"mc2_stderr": 0.01357844144939031
}
}
```
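The per-task entries above can be reduced back to headline numbers by averaging; a minimal sketch, assuming `results` is a dictionary shaped like the one printed above:
```python
def hendrycks_average(results: dict) -> tuple[float, float]:
    """Average acc/acc_norm over the hendrycksTest (MMLU) tasks of a
    results dictionary shaped like the one printed above."""
    tasks = [v for k, v in results.items() if k.startswith("harness|hendrycksTest")]
    acc = sum(v["acc"] for v in tasks) / len(tasks)
    acc_norm = sum(v["acc_norm"] for v in tasks) / len(tasks)
    return acc, acc_norm
```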
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_FelixChao__CodeLlama13B-Finetune-v1 | 2023-09-18T14:43:43.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of FelixChao/CodeLlama13B-Finetune-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FelixChao/CodeLlama13B-Finetune-v1](https://huggingface.co/FelixChao/CodeLlama13B-Finetune-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__CodeLlama13B-Finetune-v1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T14:42:15.580779](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__CodeLlama13B-Finetune-v1/blob/main/results_2023-09-18T14-42-15.580779.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4513695262727952,\n\
\ \"acc_stderr\": 0.03544839981113789,\n \"acc_norm\": 0.4547450667099068,\n\
\ \"acc_norm_stderr\": 0.035442735459508413,\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839673,\n \"mc2\": 0.4497302161076769,\n\
\ \"mc2_stderr\": 0.01496180464895227\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4402730375426621,\n \"acc_stderr\": 0.014506769524804243,\n\
\ \"acc_norm\": 0.45819112627986347,\n \"acc_norm_stderr\": 0.0145602203087147\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5123481378211512,\n\
\ \"acc_stderr\": 0.004988259530472478,\n \"acc_norm\": 0.6935869348735312,\n\
\ \"acc_norm_stderr\": 0.004600612000422692\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.040335656678483184,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.040335656678483184\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4490566037735849,\n \"acc_stderr\": 0.030612730713641092,\n\
\ \"acc_norm\": 0.4490566037735849,\n \"acc_norm_stderr\": 0.030612730713641092\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\
\ \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.3680555555555556,\n\
\ \"acc_norm_stderr\": 0.04032999053960719\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3930635838150289,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.3930635838150289,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383889,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383889\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.03047297336338005,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.03047297336338005\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.0235776047916558,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.0235776047916558\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.45806451612903226,\n\
\ \"acc_stderr\": 0.02834378725054063,\n \"acc_norm\": 0.45806451612903226,\n\
\ \"acc_norm_stderr\": 0.02834378725054063\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n\
\ \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398395,\n\
\ \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398395\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5757575757575758,\n \"acc_stderr\": 0.03521224908841586,\n \"\
acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03521224908841586\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6062176165803109,\n \"acc_stderr\": 0.0352607709554824,\n\
\ \"acc_norm\": 0.6062176165803109,\n \"acc_norm_stderr\": 0.0352607709554824\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4153846153846154,\n \"acc_stderr\": 0.024985354923102325,\n\
\ \"acc_norm\": 0.4153846153846154,\n \"acc_norm_stderr\": 0.024985354923102325\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114996,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114996\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115006,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115006\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5908256880733945,\n \"acc_stderr\": 0.021080670264433728,\n \"\
acc_norm\": 0.5908256880733945,\n \"acc_norm_stderr\": 0.021080670264433728\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5294117647058824,\n \"acc_stderr\": 0.03503235296367992,\n \"\
acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03503235296367992\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5611814345991561,\n \"acc_stderr\": 0.032302649315470375,\n \
\ \"acc_norm\": 0.5611814345991561,\n \"acc_norm_stderr\": 0.032302649315470375\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4170403587443946,\n\
\ \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.4170403587443946,\n\
\ \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4961832061068702,\n \"acc_stderr\": 0.043851623256015534,\n\
\ \"acc_norm\": 0.4961832061068702,\n \"acc_norm_stderr\": 0.043851623256015534\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.039194155450484096,\n\
\ \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.039194155450484096\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.0482572933735639,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.0482572933735639\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7264957264957265,\n\
\ \"acc_stderr\": 0.02920254015343118,\n \"acc_norm\": 0.7264957264957265,\n\
\ \"acc_norm_stderr\": 0.02920254015343118\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5325670498084292,\n\
\ \"acc_stderr\": 0.017841995750520877,\n \"acc_norm\": 0.5325670498084292,\n\
\ \"acc_norm_stderr\": 0.017841995750520877\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.43641618497109824,\n \"acc_stderr\": 0.026700545424943677,\n\
\ \"acc_norm\": 0.43641618497109824,\n \"acc_norm_stderr\": 0.026700545424943677\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2748603351955307,\n\
\ \"acc_stderr\": 0.014931316703220513,\n \"acc_norm\": 0.2748603351955307,\n\
\ \"acc_norm_stderr\": 0.014931316703220513\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.028607893699576063,\n\
\ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.028607893699576063\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5273311897106109,\n\
\ \"acc_stderr\": 0.02835563356832818,\n \"acc_norm\": 0.5273311897106109,\n\
\ \"acc_norm_stderr\": 0.02835563356832818\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.027586006221607708,\n\
\ \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.027586006221607708\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611317,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611317\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32073011734028684,\n\
\ \"acc_stderr\": 0.011921199991782627,\n \"acc_norm\": 0.32073011734028684,\n\
\ \"acc_norm_stderr\": 0.011921199991782627\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.029722152099280058,\n\
\ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.029722152099280058\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3758169934640523,\n \"acc_stderr\": 0.019594021136577464,\n \
\ \"acc_norm\": 0.3758169934640523,\n \"acc_norm_stderr\": 0.019594021136577464\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5510204081632653,\n \"acc_stderr\": 0.03184213866687579,\n\
\ \"acc_norm\": 0.5510204081632653,\n \"acc_norm_stderr\": 0.03184213866687579\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.037998574544796354,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.037998574544796354\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5672514619883041,\n \"acc_stderr\": 0.03799978644370607,\n\
\ \"acc_norm\": 0.5672514619883041,\n \"acc_norm_stderr\": 0.03799978644370607\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839673,\n \"mc2\": 0.4497302161076769,\n\
\ \"mc2_stderr\": 0.01496180464895227\n }\n}\n```"
repo_url: https://huggingface.co/FelixChao/CodeLlama13B-Finetune-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-42-15.580779.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-42-15.580779.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-42-15.580779.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-42-15.580779.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_42_15.580779
path:
- results_2023-09-18T14-42-15.580779.parquet
- split: latest
path:
- results_2023-09-18T14-42-15.580779.parquet
---
# Dataset Card for Evaluation run of FelixChao/CodeLlama13B-Finetune-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FelixChao/CodeLlama13B-Finetune-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [FelixChao/CodeLlama13B-Finetune-v1](https://huggingface.co/FelixChao/CodeLlama13B-Finetune-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__CodeLlama13B-Finetune-v1",
"harness_truthfulqa_mc_0",
split="train")
```
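The same pattern works for any of the configurations listed in the YAML header above. As a minimal sketch (assuming the `datasets` library is installed; the config and split names below are taken verbatim from that header), you could load the aggregated `results` configuration or a single per-task configuration:
```python
from datasets import load_dataset

# Aggregated metrics for the run; "results" and "latest" are the config
# and split names defined in the YAML header of this card.
results = load_dataset(
    "open-llm-leaderboard/details_FelixChao__CodeLlama13B-Finetune-v1",
    "results",
    split="latest",
)

# Per-task details work the same way, e.g. one 5-shot MMLU subject:
law = load_dataset(
    "open-llm-leaderboard/details_FelixChao__CodeLlama13B-Finetune-v1",
    "harness_hendrycksTest_professional_law_5",
    split="latest",
)
print(results)
print(law)
```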
## Latest results
These are the [latest results from run 2023-09-18T14:42:15.580779](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__CodeLlama13B-Finetune-v1/blob/main/results_2023-09-18T14-42-15.580779.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find the results for each eval in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.4513695262727952,
"acc_stderr": 0.03544839981113789,
"acc_norm": 0.4547450667099068,
"acc_norm_stderr": 0.035442735459508413,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839673,
"mc2": 0.4497302161076769,
"mc2_stderr": 0.01496180464895227
},
"harness|arc:challenge|25": {
"acc": 0.4402730375426621,
"acc_stderr": 0.014506769524804243,
"acc_norm": 0.45819112627986347,
"acc_norm_stderr": 0.0145602203087147
},
"harness|hellaswag|10": {
"acc": 0.5123481378211512,
"acc_stderr": 0.004988259530472478,
"acc_norm": 0.6935869348735312,
"acc_norm_stderr": 0.004600612000422692
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.040335656678483184,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.040335656678483184
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4490566037735849,
"acc_stderr": 0.030612730713641092,
"acc_norm": 0.4490566037735849,
"acc_norm_stderr": 0.030612730713641092
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.04032999053960719,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.04032999053960719
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3930635838150289,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.3930635838150289,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383889,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383889
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.03047297336338005,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.03047297336338005
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.0235776047916558,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.0235776047916558
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.45806451612903226,
"acc_stderr": 0.02834378725054063,
"acc_norm": 0.45806451612903226,
"acc_norm_stderr": 0.02834378725054063
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.03851716319398395,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.03851716319398395
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03521224908841586,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03521224908841586
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6062176165803109,
"acc_stderr": 0.0352607709554824,
"acc_norm": 0.6062176165803109,
"acc_norm_stderr": 0.0352607709554824
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4153846153846154,
"acc_stderr": 0.024985354923102325,
"acc_norm": 0.4153846153846154,
"acc_norm_stderr": 0.024985354923102325
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114996,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114996
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115006,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5908256880733945,
"acc_stderr": 0.021080670264433728,
"acc_norm": 0.5908256880733945,
"acc_norm_stderr": 0.021080670264433728
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03503235296367992,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03503235296367992
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5611814345991561,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.5611814345991561,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4170403587443946,
"acc_stderr": 0.03309266936071721,
"acc_norm": 0.4170403587443946,
"acc_norm_stderr": 0.03309266936071721
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4961832061068702,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.4961832061068702,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5337423312883436,
"acc_stderr": 0.039194155450484096,
"acc_norm": 0.5337423312883436,
"acc_norm_stderr": 0.039194155450484096
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.0482572933735639,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.0482572933735639
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7264957264957265,
"acc_stderr": 0.02920254015343118,
"acc_norm": 0.7264957264957265,
"acc_norm_stderr": 0.02920254015343118
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5325670498084292,
"acc_stderr": 0.017841995750520877,
"acc_norm": 0.5325670498084292,
"acc_norm_stderr": 0.017841995750520877
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.43641618497109824,
"acc_stderr": 0.026700545424943677,
"acc_norm": 0.43641618497109824,
"acc_norm_stderr": 0.026700545424943677
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2748603351955307,
"acc_stderr": 0.014931316703220513,
"acc_norm": 0.2748603351955307,
"acc_norm_stderr": 0.014931316703220513
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.028607893699576063,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.028607893699576063
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5273311897106109,
"acc_stderr": 0.02835563356832818,
"acc_norm": 0.5273311897106109,
"acc_norm_stderr": 0.02835563356832818
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.027586006221607708,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.027586006221607708
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611317,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611317
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32073011734028684,
"acc_stderr": 0.011921199991782627,
"acc_norm": 0.32073011734028684,
"acc_norm_stderr": 0.011921199991782627
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.029722152099280058,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.029722152099280058
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3758169934640523,
"acc_stderr": 0.019594021136577464,
"acc_norm": 0.3758169934640523,
"acc_norm_stderr": 0.019594021136577464
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5510204081632653,
"acc_stderr": 0.03184213866687579,
"acc_norm": 0.5510204081632653,
"acc_norm_stderr": 0.03184213866687579
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.037998574544796354,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.037998574544796354
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5672514619883041,
"acc_stderr": 0.03799978644370607,
"acc_norm": 0.5672514619883041,
"acc_norm_stderr": 0.03799978644370607
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839673,
"mc2": 0.4497302161076769,
"mc2_stderr": 0.01496180464895227
}
}
```
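To turn the JSON above into headline numbers, here is a small sketch: the dictionary below copies only a handful of the entries shown above, and the unweighted averaging over the `hendrycksTest` subtasks is an assumption about one plausible MMLU summary, not the leaderboard's exact aggregation code.
```python
# A few entries copied from the results above, for illustration only.
results = {
    "all": {"acc": 0.4513695262727952, "acc_norm": 0.4547450667099068},
    "harness|arc:challenge|25": {"acc": 0.4402730375426621,
                                 "acc_norm": 0.45819112627986347},
    "harness|hellaswag|10": {"acc": 0.5123481378211512,
                             "acc_norm": 0.6935869348735312},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.23, "acc_norm": 0.23},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.5672514619883041,
                                                "acc_norm": 0.5672514619883041},
}

# MMLU subtasks share the "harness|hendrycksTest-" prefix; an unweighted
# mean over their accuracies gives one plausible MMLU summary (assumption:
# the leaderboard may weight or normalize differently).
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
print(f"MMLU mean acc over {len(mmlu)} subjects: {sum(mmlu) / len(mmlu):.4f}")
print(f"Overall acc: {results['all']['acc']:.4f}")
```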
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Rallio67__7B-redpajama-conditional-alpha | 2023-09-18T14:46:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Rallio67/7B-redpajama-conditional-alpha
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Rallio67/7B-redpajama-conditional-alpha](https://huggingface.co/Rallio67/7B-redpajama-conditional-alpha)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Rallio67__7B-redpajama-conditional-alpha\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T14:45:25.410527](https://huggingface.co/datasets/open-llm-leaderboard/details_Rallio67__7B-redpajama-conditional-alpha/blob/main/results_2023-09-18T14-45-25.410527.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find the results for each eval in its own configuration,\
\ under the \"latest\" split):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.27163217313171956,\n\
\ \"acc_stderr\": 0.03222644381599448,\n \"acc_norm\": 0.27538540648574494,\n\
\ \"acc_norm_stderr\": 0.032223322532186016,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931586,\n \"mc2\": 0.3642195102296637,\n\
\ \"mc2_stderr\": 0.013530642394858749\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3856655290102389,\n \"acc_stderr\": 0.01422425097325717,\n\
\ \"acc_norm\": 0.4257679180887372,\n \"acc_norm_stderr\": 0.014449464278868805\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5177255526787492,\n\
\ \"acc_stderr\": 0.004986644894743123,\n \"acc_norm\": 0.6990639314877515,\n\
\ \"acc_norm_stderr\": 0.004577275844432454\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073464,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073464\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24528301886792453,\n \"acc_stderr\": 0.02648035717989569,\n\
\ \"acc_norm\": 0.24528301886792453,\n \"acc_norm_stderr\": 0.02648035717989569\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2832369942196532,\n\
\ \"acc_stderr\": 0.03435568056047873,\n \"acc_norm\": 0.2832369942196532,\n\
\ \"acc_norm_stderr\": 0.03435568056047873\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.02880998985410297,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.02880998985410297\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963283,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963283\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924316,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23548387096774193,\n\
\ \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.23548387096774193,\n\
\ \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.0316185633535861,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.0316185633535861\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.16666666666666666,\n \"acc_stderr\": 0.026552207828215286,\n \"\
acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.026552207828215286\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.02977866303775295,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.02977866303775295\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.020932445774463196,\n\
\ \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.020932445774463196\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275882,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275882\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360385,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360385\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24587155963302754,\n \"acc_stderr\": 0.018461940968708443,\n \"\
acc_norm\": 0.24587155963302754,\n \"acc_norm_stderr\": 0.018461940968708443\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.20833333333333334,\n \"acc_stderr\": 0.027696910713093936,\n \"\
acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.027696910713093936\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.02977177522814563,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02977177522814563\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.24472573839662448,\n \"acc_stderr\": 0.02798569938703642,\n \
\ \"acc_norm\": 0.24472573839662448,\n \"acc_norm_stderr\": 0.02798569938703642\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.30493273542600896,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.30493273542600896,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847834,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847834\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.19834710743801653,\n \"acc_stderr\": 0.03640118271990944,\n \"\
acc_norm\": 0.19834710743801653,\n \"acc_norm_stderr\": 0.03640118271990944\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26053639846743293,\n\
\ \"acc_stderr\": 0.015696008563807092,\n \"acc_norm\": 0.26053639846743293,\n\
\ \"acc_norm_stderr\": 0.015696008563807092\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.30346820809248554,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.30346820809248554,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925307,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925307\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3366013071895425,\n \"acc_stderr\": 0.02705797462449438,\n\
\ \"acc_norm\": 0.3366013071895425,\n \"acc_norm_stderr\": 0.02705797462449438\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n\
\ \"acc_stderr\": 0.025218040373410622,\n \"acc_norm\": 0.27009646302250806,\n\
\ \"acc_norm_stderr\": 0.025218040373410622\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24382716049382716,\n \"acc_stderr\": 0.023891879541959614,\n\
\ \"acc_norm\": 0.24382716049382716,\n \"acc_norm_stderr\": 0.023891879541959614\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843007,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843007\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26988265971316816,\n\
\ \"acc_stderr\": 0.011337381084250397,\n \"acc_norm\": 0.26988265971316816,\n\
\ \"acc_norm_stderr\": 0.011337381084250397\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541093,\n\
\ \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541093\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2761437908496732,\n \"acc_stderr\": 0.018087276935663137,\n \
\ \"acc_norm\": 0.2761437908496732,\n \"acc_norm_stderr\": 0.018087276935663137\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.34285714285714286,\n \"acc_stderr\": 0.03038726291954773,\n\
\ \"acc_norm\": 0.34285714285714286,\n \"acc_norm_stderr\": 0.03038726291954773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.031157150869355547,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.031157150869355547\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.03571609230053481,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.03571609230053481\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.36257309941520466,\n \"acc_stderr\": 0.036871306155620606,\n\
\ \"acc_norm\": 0.36257309941520466,\n \"acc_norm_stderr\": 0.036871306155620606\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931586,\n \"mc2\": 0.3642195102296637,\n\
\ \"mc2_stderr\": 0.013530642394858749\n }\n}\n```"
repo_url: https://huggingface.co/Rallio67/7B-redpajama-conditional-alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-45-25.410527.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-45-25.410527.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-45-25.410527.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-45-25.410527.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_45_25.410527
path:
- results_2023-09-18T14-45-25.410527.parquet
- split: latest
path:
- results_2023-09-18T14-45-25.410527.parquet
---
# Dataset Card for Evaluation run of Rallio67/7B-redpajama-conditional-alpha
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Rallio67/7B-redpajama-conditional-alpha
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Rallio67/7B-redpajama-conditional-alpha](https://huggingface.co/Rallio67/7B-redpajama-conditional-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
# any config name from the YAML header works here; the "latest" split
# always points to the most recent evaluation run
data = load_dataset("open-llm-leaderboard/details_Rallio67__7B-redpajama-conditional-alpha",
	"harness_truthfulqa_mc_0",
	split="latest")
```
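Similarly, the aggregated metrics can be pulled from the "results" configuration defined in the YAML header above (a minimal sketch; the config and split names follow that YAML):
```python
from datasets import load_dataset

# "results" is the aggregate config listed in the YAML header; "latest"
# resolves to the parquet file of the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Rallio67__7B-redpajama-conditional-alpha",
	"results",
	split="latest")
```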
## Latest results
These are the [latest results from run 2023-09-18T14:45:25.410527](https://huggingface.co/datasets/open-llm-leaderboard/details_Rallio67__7B-redpajama-conditional-alpha/blob/main/results_2023-09-18T14-45-25.410527.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27163217313171956,
"acc_stderr": 0.03222644381599448,
"acc_norm": 0.27538540648574494,
"acc_norm_stderr": 0.032223322532186016,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931586,
"mc2": 0.3642195102296637,
"mc2_stderr": 0.013530642394858749
},
"harness|arc:challenge|25": {
"acc": 0.3856655290102389,
"acc_stderr": 0.01422425097325717,
"acc_norm": 0.4257679180887372,
"acc_norm_stderr": 0.014449464278868805
},
"harness|hellaswag|10": {
"acc": 0.5177255526787492,
"acc_stderr": 0.004986644894743123,
"acc_norm": 0.6990639314877515,
"acc_norm_stderr": 0.004577275844432454
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073464,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073464
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.28289473684210525,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.28289473684210525,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24528301886792453,
"acc_stderr": 0.02648035717989569,
"acc_norm": 0.24528301886792453,
"acc_norm_stderr": 0.02648035717989569
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.03435568056047873,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.03435568056047873
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.02880998985410297,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.02880998985410297
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.023517294335963283,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.023517294335963283
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924316,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23548387096774193,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.23548387096774193,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.0316185633535861,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.0316185633535861
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.026552207828215286,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.026552207828215286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.02977866303775295,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.02977866303775295
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.020932445774463196,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.020932445774463196
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.026265024608275882,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.026265024608275882
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360385,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360385
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24587155963302754,
"acc_stderr": 0.018461940968708443,
"acc_norm": 0.24587155963302754,
"acc_norm_stderr": 0.018461940968708443
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.027696910713093936,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.027696910713093936
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24472573839662448,
"acc_stderr": 0.02798569938703642,
"acc_norm": 0.24472573839662448,
"acc_norm_stderr": 0.02798569938703642
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.30493273542600896,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.30493273542600896,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847834,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847834
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.19834710743801653,
"acc_stderr": 0.03640118271990944,
"acc_norm": 0.19834710743801653,
"acc_norm_stderr": 0.03640118271990944
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26053639846743293,
"acc_stderr": 0.015696008563807092,
"acc_norm": 0.26053639846743293,
"acc_norm_stderr": 0.015696008563807092
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30346820809248554,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.30346820809248554,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925307,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925307
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3366013071895425,
"acc_stderr": 0.02705797462449438,
"acc_norm": 0.3366013071895425,
"acc_norm_stderr": 0.02705797462449438
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410622,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24382716049382716,
"acc_stderr": 0.023891879541959614,
"acc_norm": 0.24382716049382716,
"acc_norm_stderr": 0.023891879541959614
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26988265971316816,
"acc_stderr": 0.011337381084250397,
"acc_norm": 0.26988265971316816,
"acc_norm_stderr": 0.011337381084250397
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19852941176470587,
"acc_stderr": 0.024231013370541093,
"acc_norm": 0.19852941176470587,
"acc_norm_stderr": 0.024231013370541093
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2761437908496732,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.2761437908496732,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.34285714285714286,
"acc_stderr": 0.03038726291954773,
"acc_norm": 0.34285714285714286,
"acc_norm_stderr": 0.03038726291954773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.031157150869355547,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.031157150869355547
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.03571609230053481,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.03571609230053481
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.36257309941520466,
"acc_stderr": 0.036871306155620606,
"acc_norm": 0.36257309941520466,
"acc_norm_stderr": 0.036871306155620606
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931586,
"mc2": 0.3642195102296637,
"mc2_stderr": 0.013530642394858749
}
}
```
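The top-level "all" block aggregates the per-task numbers above. As a minimal, self-contained sketch (with only three entries inlined from this run for brevity), the mean MMLU accuracy can be recomputed from the per-subtask dict like this:
```python
# Hypothetical excerpt of the results dict printed above; values are
# copied from this run, and only three entries are inlined for brevity.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.28},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.22962962962962963},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.28289473684210525},
}

# Average the 5-shot accuracy over all hendrycksTest (MMLU) subtasks.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(f"mean acc over {len(mmlu)} MMLU subtasks: {sum(mmlu) / len(mmlu):.4f}")
```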
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_KnutJaegersberg__deacon-3b | 2023-09-18T14:49:04.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of KnutJaegersberg/deacon-3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/deacon-3b](https://huggingface.co/KnutJaegersberg/deacon-3b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__deacon-3b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T14:47:42.541004](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__deacon-3b/blob/main/results_2023-09-18T14-47-42.541004.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2764583742253961,\n\
\ \"acc_stderr\": 0.03225633548007134,\n \"acc_norm\": 0.2801146727723665,\n\
\ \"acc_norm_stderr\": 0.03225684867687274,\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.3606875505864814,\n\
\ \"mc2_stderr\": 0.013631445369195945\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35580204778157,\n \"acc_stderr\": 0.013990571137918758,\n\
\ \"acc_norm\": 0.3967576791808874,\n \"acc_norm_stderr\": 0.014296513020180646\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4894443337980482,\n\
\ \"acc_stderr\": 0.0049886693437869644,\n \"acc_norm\": 0.6642103166699861,\n\
\ \"acc_norm_stderr\": 0.004713006072807721\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n\
\ \"acc_stderr\": 0.03999262876617722,\n \"acc_norm\": 0.3111111111111111,\n\
\ \"acc_norm_stderr\": 0.03999262876617722\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2565789473684211,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.2565789473684211,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.19,\n\
\ \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \
\ \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.027724236492700904,\n\
\ \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.027724236492700904\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080342,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080342\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.27167630057803466,\n\
\ \"acc_stderr\": 0.0339175032232166,\n \"acc_norm\": 0.27167630057803466,\n\
\ \"acc_norm_stderr\": 0.0339175032232166\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33191489361702126,\n \"acc_stderr\": 0.030783736757745647,\n\
\ \"acc_norm\": 0.33191489361702126,\n \"acc_norm_stderr\": 0.030783736757745647\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.023577604791655805,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.023577604791655805\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.03512207412302051,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.03512207412302051\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.267741935483871,\n \"acc_stderr\": 0.025189006660212378,\n \"\
acc_norm\": 0.267741935483871,\n \"acc_norm_stderr\": 0.025189006660212378\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.0317852971064275,\n \"acc_norm\"\
: 0.2857142857142857,\n \"acc_norm_stderr\": 0.0317852971064275\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.0340150671524904,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.0340150671524904\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3282828282828283,\n \"acc_stderr\": 0.03345678422756776,\n \"\
acc_norm\": 0.3282828282828283,\n \"acc_norm_stderr\": 0.03345678422756776\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.031821550509166484,\n\
\ \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.031821550509166484\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31025641025641026,\n \"acc_stderr\": 0.023454674889404295,\n\
\ \"acc_norm\": 0.31025641025641026,\n \"acc_norm_stderr\": 0.023454674889404295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.28623853211009176,\n \"acc_stderr\": 0.019379436628919982,\n \"\
acc_norm\": 0.28623853211009176,\n \"acc_norm_stderr\": 0.019379436628919982\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.03214952147802749,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03214952147802749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27450980392156865,\n \"acc_stderr\": 0.031321798030832904,\n \"\
acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.031321798030832904\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.28270042194092826,\n \"acc_stderr\": 0.029312814153955934,\n \
\ \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.029312814153955934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2914798206278027,\n\
\ \"acc_stderr\": 0.030500283176545913,\n \"acc_norm\": 0.2914798206278027,\n\
\ \"acc_norm_stderr\": 0.030500283176545913\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.183206106870229,\n \"acc_stderr\": 0.033927709264947335,\n\
\ \"acc_norm\": 0.183206106870229,\n \"acc_norm_stderr\": 0.033927709264947335\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.32231404958677684,\n \"acc_stderr\": 0.042664163633521685,\n \"\
acc_norm\": 0.32231404958677684,\n \"acc_norm_stderr\": 0.042664163633521685\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n\
\ \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \
\ \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326469,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326469\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2264957264957265,\n\
\ \"acc_stderr\": 0.02742100729539292,\n \"acc_norm\": 0.2264957264957265,\n\
\ \"acc_norm_stderr\": 0.02742100729539292\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2988505747126437,\n\
\ \"acc_stderr\": 0.016369256815093124,\n \"acc_norm\": 0.2988505747126437,\n\
\ \"acc_norm_stderr\": 0.016369256815093124\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.024257901705323374,\n\
\ \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.024257901705323374\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.01428834380392531,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.01428834380392531\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.025261691219729484,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.025261691219729484\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2829581993569132,\n\
\ \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.2829581993569132,\n\
\ \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2907801418439716,\n \"acc_stderr\": 0.027090664368353178,\n \
\ \"acc_norm\": 0.2907801418439716,\n \"acc_norm_stderr\": 0.027090664368353178\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2405475880052151,\n\
\ \"acc_stderr\": 0.010916406735478947,\n \"acc_norm\": 0.2405475880052151,\n\
\ \"acc_norm_stderr\": 0.010916406735478947\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3639705882352941,\n \"acc_stderr\": 0.02922719246003203,\n\
\ \"acc_norm\": 0.3639705882352941,\n \"acc_norm_stderr\": 0.02922719246003203\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27450980392156865,\n \"acc_stderr\": 0.018054027458815198,\n \
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.018054027458815198\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2530612244897959,\n \"acc_stderr\": 0.0278330238713997,\n\
\ \"acc_norm\": 0.2530612244897959,\n \"acc_norm_stderr\": 0.0278330238713997\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22388059701492538,\n\
\ \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.22388059701492538,\n\
\ \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n\
\ \"acc_stderr\": 0.03680783690727581,\n \"acc_norm\": 0.3373493975903614,\n\
\ \"acc_norm_stderr\": 0.03680783690727581\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03615507630310935,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03615507630310935\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.3606875505864814,\n\
\ \"mc2_stderr\": 0.013631445369195945\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/deacon-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-47-42.541004.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- results_2023-09-18T14-47-42.541004.parquet
- split: latest
path:
- results_2023-09-18T14-47-42.541004.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/deacon-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/deacon-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/deacon-3b](https://huggingface.co/KnutJaegersberg/deacon-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__deacon-3b",
"harness_truthfulqa_mc_0",
split="train")
```
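Each per-task configuration also exposes a `latest` split, and the aggregated metrics of the run live in the `results` configuration (see the `configs` mapping in the YAML header above). A minimal sketch of loading both, assuming the same repository:
```python
from datasets import load_dataset

# Aggregated metrics of the run ("results" configuration).
results = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__deacon-3b",
	"results",
	split="latest")

# Most recent per-sample details for a single task, without naming
# the timestamped snapshot explicitly.
global_facts = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__deacon-3b",
	"harness_hendrycksTest_global_facts_5",
	split="latest")
```
Both calls resolve to the same parquet files as the corresponding timestamped split of the most recent run.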
## Latest results
These are the [latest results from run 2023-09-18T14:47:42.541004](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__deacon-3b/blob/main/results_2023-09-18T14-47-42.541004.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2764583742253961,
"acc_stderr": 0.03225633548007134,
"acc_norm": 0.2801146727723665,
"acc_norm_stderr": 0.03225684867687274,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.3606875505864814,
"mc2_stderr": 0.013631445369195945
},
"harness|arc:challenge|25": {
"acc": 0.35580204778157,
"acc_stderr": 0.013990571137918758,
"acc_norm": 0.3967576791808874,
"acc_norm_stderr": 0.014296513020180646
},
"harness|hellaswag|10": {
"acc": 0.4894443337980482,
"acc_stderr": 0.0049886693437869644,
"acc_norm": 0.6642103166699861,
"acc_norm_stderr": 0.004713006072807721
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.03999262876617722,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.03999262876617722
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2565789473684211,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.2565789473684211,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.027724236492700904,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.027724236492700904
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080342,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080342
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.0339175032232166,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.0339175032232166
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33191489361702126,
"acc_stderr": 0.030783736757745647,
"acc_norm": 0.33191489361702126,
"acc_norm_stderr": 0.030783736757745647
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.023577604791655805,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.023577604791655805
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302051,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302051
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.267741935483871,
"acc_stderr": 0.025189006660212378,
"acc_norm": 0.267741935483871,
"acc_norm_stderr": 0.025189006660212378
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0317852971064275,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0317852971064275
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.0340150671524904,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.0340150671524904
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3282828282828283,
"acc_stderr": 0.03345678422756776,
"acc_norm": 0.3282828282828283,
"acc_norm_stderr": 0.03345678422756776
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.26424870466321243,
"acc_stderr": 0.031821550509166484,
"acc_norm": 0.26424870466321243,
"acc_norm_stderr": 0.031821550509166484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31025641025641026,
"acc_stderr": 0.023454674889404295,
"acc_norm": 0.31025641025641026,
"acc_norm_stderr": 0.023454674889404295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28623853211009176,
"acc_stderr": 0.019379436628919982,
"acc_norm": 0.28623853211009176,
"acc_norm_stderr": 0.019379436628919982
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.029312814153955934,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.029312814153955934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2914798206278027,
"acc_stderr": 0.030500283176545913,
"acc_norm": 0.2914798206278027,
"acc_norm_stderr": 0.030500283176545913
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.183206106870229,
"acc_stderr": 0.033927709264947335,
"acc_norm": 0.183206106870229,
"acc_norm_stderr": 0.033927709264947335
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.32231404958677684,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.32231404958677684,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.1875,
"acc_stderr": 0.0370468111477387,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.0370468111477387
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326469,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326469
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2264957264957265,
"acc_stderr": 0.02742100729539292,
"acc_norm": 0.2264957264957265,
"acc_norm_stderr": 0.02742100729539292
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2988505747126437,
"acc_stderr": 0.016369256815093124,
"acc_norm": 0.2988505747126437,
"acc_norm_stderr": 0.016369256815093124
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.024257901705323374,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.024257901705323374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.01428834380392531,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.01428834380392531
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2829581993569132,
"acc_stderr": 0.025583062489984824,
"acc_norm": 0.2829581993569132,
"acc_norm_stderr": 0.025583062489984824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2907801418439716,
"acc_stderr": 0.027090664368353178,
"acc_norm": 0.2907801418439716,
"acc_norm_stderr": 0.027090664368353178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2405475880052151,
"acc_stderr": 0.010916406735478947,
"acc_norm": 0.2405475880052151,
"acc_norm_stderr": 0.010916406735478947
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3639705882352941,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.3639705882352941,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.018054027458815198,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.018054027458815198
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2530612244897959,
"acc_stderr": 0.0278330238713997,
"acc_norm": 0.2530612244897959,
"acc_norm_stderr": 0.0278330238713997
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22388059701492538,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.22388059701492538,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.03680783690727581,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.03680783690727581
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03615507630310935,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03615507630310935
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.3606875505864814,
"mc2_stderr": 0.013631445369195945
}
}
```
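For quick aggregate checks, the per-task entries above can be reduced directly. A minimal sketch, assuming the dictionary printed above has been parsed into a Python variable named `results` (for instance via `json.load` on the linked results file; the exact file layout on the Hub may nest this dictionary under another key):
```python
# Mean 5-shot accuracy over all hendrycksTest (MMLU) sub-tasks,
# computed from the per-task "acc" values shown above.
mmlu = {task: scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")}
mean_acc = sum(mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU sub-tasks, mean acc = {mean_acc:.4f}")
```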
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
alexalbala/my_test_dataset | 2023-09-20T10:24:48.000Z | [
"license:cc",
"region:us"
] | alexalbala | null | null | null | 0 | 0 | ---
license: cc
---
|
open-llm-leaderboard/details_euclaise__falcon_1b_stage1 | 2023-09-18T15:15:38.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of euclaise/falcon_1b_stage1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [euclaise/falcon_1b_stage1](https://huggingface.co/euclaise/falcon_1b_stage1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_euclaise__falcon_1b_stage1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T15:14:21.518286](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage1/blob/main/results_2023-09-18T15-14-21.518286.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24968311024212061,\n\
\ \"acc_stderr\": 0.03137758849784344,\n \"acc_norm\": 0.2529298027043813,\n\
\ \"acc_norm_stderr\": 0.0313807620027171,\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.3999657351833433,\n\
\ \"mc2_stderr\": 0.013864802658827929\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3191126279863481,\n \"acc_stderr\": 0.013621696119173302,\n\
\ \"acc_norm\": 0.3515358361774744,\n \"acc_norm_stderr\": 0.013952413699600938\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46484763991236805,\n\
\ \"acc_stderr\": 0.004977434505403358,\n \"acc_norm\": 0.6239792869946226,\n\
\ \"acc_norm_stderr\": 0.004833953712521772\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n\
\ \"acc_stderr\": 0.035025531706783165,\n \"acc_norm\": 0.2074074074074074,\n\
\ \"acc_norm_stderr\": 0.035025531706783165\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653696,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653696\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.02924188386962882,\n\
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.02924188386962882\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577657,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577657\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.18387096774193548,\n\
\ \"acc_stderr\": 0.022037217340267836,\n \"acc_norm\": 0.18387096774193548,\n\
\ \"acc_norm_stderr\": 0.022037217340267836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.1724137931034483,\n \"acc_stderr\": 0.026577672183036576,\n\
\ \"acc_norm\": 0.1724137931034483,\n \"acc_norm_stderr\": 0.026577672183036576\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268048,\n\
\ \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268048\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20207253886010362,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.20207253886010362,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2153846153846154,\n \"acc_stderr\": 0.020843034557462878,\n\
\ \"acc_norm\": 0.2153846153846154,\n \"acc_norm_stderr\": 0.020843034557462878\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.024762902678057926,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.024762902678057926\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.036848815213890246,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.036848815213890246\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.20550458715596331,\n \"acc_stderr\": 0.017324352325016015,\n \"\
acc_norm\": 0.20550458715596331,\n \"acc_norm_stderr\": 0.017324352325016015\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.19444444444444445,\n \"acc_stderr\": 0.026991454502036726,\n \"\
acc_norm\": 0.19444444444444445,\n \"acc_norm_stderr\": 0.026991454502036726\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28921568627450983,\n \"acc_stderr\": 0.03182231867647553,\n \"\
acc_norm\": 0.28921568627450983,\n \"acc_norm_stderr\": 0.03182231867647553\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2825112107623318,\n\
\ \"acc_stderr\": 0.03021683101150877,\n \"acc_norm\": 0.2825112107623318,\n\
\ \"acc_norm_stderr\": 0.03021683101150877\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.03192193448934723,\n\
\ \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.03192193448934723\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.29914529914529914,\n\
\ \"acc_stderr\": 0.029996951858349497,\n \"acc_norm\": 0.29914529914529914,\n\
\ \"acc_norm_stderr\": 0.029996951858349497\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24776500638569604,\n\
\ \"acc_stderr\": 0.015438083080568973,\n \"acc_norm\": 0.24776500638569604,\n\
\ \"acc_norm_stderr\": 0.015438083080568973\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2630057803468208,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.2630057803468208,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.024170840879341005,\n\
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.024170840879341005\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n\
\ \"acc_stderr\": 0.0254942593506949,\n \"acc_norm\": 0.2797427652733119,\n\
\ \"acc_norm_stderr\": 0.0254942593506949\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.023468429832451163,\n\
\ \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.023468429832451163\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.22340425531914893,\n \"acc_stderr\": 0.024847921358063962,\n \
\ \"acc_norm\": 0.22340425531914893,\n \"acc_norm_stderr\": 0.024847921358063962\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2301173402868318,\n\
\ \"acc_stderr\": 0.010750183177375548,\n \"acc_norm\": 0.2301173402868318,\n\
\ \"acc_norm_stderr\": 0.010750183177375548\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541097,\n\
\ \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541097\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528034,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528034\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n\
\ \"acc_stderr\": 0.03764425585984926,\n \"acc_norm\": 0.19090909090909092,\n\
\ \"acc_norm_stderr\": 0.03764425585984926\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.21224489795918366,\n \"acc_stderr\": 0.026176967197866764,\n\
\ \"acc_norm\": 0.21224489795918366,\n \"acc_norm_stderr\": 0.026176967197866764\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n\
\ \"acc_stderr\": 0.02970528405677244,\n \"acc_norm\": 0.22885572139303484,\n\
\ \"acc_norm_stderr\": 0.02970528405677244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.3999657351833433,\n\
\ \"mc2_stderr\": 0.013864802658827929\n }\n}\n```"
repo_url: https://huggingface.co/euclaise/falcon_1b_stage1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|arc:challenge|25_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hellaswag|10_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T15-14-21.518286.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-14-21.518286.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T15-14-21.518286.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T15-14-21.518286.parquet'
- config_name: results
data_files:
- split: 2023_09_18T15_14_21.518286
path:
- results_2023-09-18T15-14-21.518286.parquet
- split: latest
path:
- results_2023-09-18T15-14-21.518286.parquet
---
# Dataset Card for Evaluation run of euclaise/falcon_1b_stage1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/euclaise/falcon_1b_stage1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [euclaise/falcon_1b_stage1](https://huggingface.co/euclaise/falcon_1b_stage1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_euclaise__falcon_1b_stage1",
"harness_truthfulqa_mc_0",
split="train")
```
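The run-level metrics live in the "results" configuration listed above. A minimal sketch for pulling them follows; the exact record layout is an assumption and may vary between harness versions:
```python
from datasets import load_dataset

# A minimal sketch: pull the aggregated metrics from the "results"
# configuration; the "latest" split always points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_euclaise__falcon_1b_stage1",
    "results",
    split="latest",
)

# Each row holds the run-level scores (acc, acc_norm, mc1, mc2, ...);
# the exact column layout is an assumption and may differ between harness versions.
print(results[0])
```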
## Latest results
These are the [latest results from run 2023-09-18T15:14:21.518286](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage1/blob/main/results_2023-09-18T15-14-21.518286.json) (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.24968311024212061,
"acc_stderr": 0.03137758849784344,
"acc_norm": 0.2529298027043813,
"acc_norm_stderr": 0.0313807620027171,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.3999657351833433,
"mc2_stderr": 0.013864802658827929
},
"harness|arc:challenge|25": {
"acc": 0.3191126279863481,
"acc_stderr": 0.013621696119173302,
"acc_norm": 0.3515358361774744,
"acc_norm_stderr": 0.013952413699600938
},
"harness|hellaswag|10": {
"acc": 0.46484763991236805,
"acc_stderr": 0.004977434505403358,
"acc_norm": 0.6239792869946226,
"acc_norm_stderr": 0.004833953712521772
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.035025531706783165,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.035025531706783165
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653696,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653696
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.02924188386962882,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.02924188386962882
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.038351539543994194,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.038351539543994194
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577657,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18387096774193548,
"acc_stderr": 0.022037217340267836,
"acc_norm": 0.18387096774193548,
"acc_norm_stderr": 0.022037217340267836
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1724137931034483,
"acc_stderr": 0.026577672183036576,
"acc_norm": 0.1724137931034483,
"acc_norm_stderr": 0.026577672183036576
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.03567969772268048,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.03567969772268048
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20207253886010362,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.20207253886010362,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2153846153846154,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.2153846153846154,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959916,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.024762902678057926,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.024762902678057926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.036848815213890246,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.036848815213890246
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.20550458715596331,
"acc_stderr": 0.017324352325016015,
"acc_norm": 0.20550458715596331,
"acc_norm_stderr": 0.017324352325016015
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.026991454502036726,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.026991454502036726
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28921568627450983,
"acc_stderr": 0.03182231867647553,
"acc_norm": 0.28921568627450983,
"acc_norm_stderr": 0.03182231867647553
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2825112107623318,
"acc_stderr": 0.03021683101150877,
"acc_norm": 0.2825112107623318,
"acc_norm_stderr": 0.03021683101150877
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.03192193448934723,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.03192193448934723
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.29914529914529914,
"acc_stderr": 0.029996951858349497,
"acc_norm": 0.29914529914529914,
"acc_norm_stderr": 0.029996951858349497
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24776500638569604,
"acc_stderr": 0.015438083080568973,
"acc_norm": 0.24776500638569604,
"acc_norm_stderr": 0.015438083080568973
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2630057803468208,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.2630057803468208,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.024170840879341005,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.024170840879341005
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.0254942593506949,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.0254942593506949
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.023468429832451163,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.023468429832451163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22340425531914893,
"acc_stderr": 0.024847921358063962,
"acc_norm": 0.22340425531914893,
"acc_norm_stderr": 0.024847921358063962
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2301173402868318,
"acc_stderr": 0.010750183177375548,
"acc_norm": 0.2301173402868318,
"acc_norm_stderr": 0.010750183177375548
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19852941176470587,
"acc_stderr": 0.024231013370541097,
"acc_norm": 0.19852941176470587,
"acc_norm_stderr": 0.024231013370541097
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528034,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528034
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984926,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984926
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.21224489795918366,
"acc_stderr": 0.026176967197866764,
"acc_norm": 0.21224489795918366,
"acc_norm_stderr": 0.026176967197866764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.02970528405677244,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.02970528405677244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.3999657351833433,
"mc2_stderr": 0.013864802658827929
}
}
```
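As a quick sanity check, the MMLU-style average can be recomputed from the per-task entries above. A minimal sketch, assuming the linked results file has been saved locally as `results.json` and that its payload is the dict printed above (this layout is an assumption and may differ between harness versions):
```python
import json

# A minimal sketch: average the 5-shot accuracies of the hendrycksTest (MMLU)
# tasks from the per-task results shown above. The local file name and the
# top-level layout of the JSON are assumptions.
with open("results.json") as f:
    results = json.load(f)

mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(f"MMLU 5-shot average acc over {len(mmlu)} tasks: {sum(mmlu) / len(mmlu):.4f}")
```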
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Dugald/llm_training_marketing | 2023-09-18T15:50:04.000Z | [
"region:us"
] | Dugald | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 20038
num_examples: 10
download_size: 26247
dataset_size: 20038
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llm_training_marketing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
austinpatrickm/patterns | 2023-09-18T16:11:59.000Z | [
"region:us"
] | austinpatrickm | null | null | null | 0 | 0 | Entry not found |
Pielgrin/pierre | 2023-09-18T16:14:02.000Z | [
"region:us"
] | Pielgrin | null | null | null | 0 | 0 | Entry not found |
MasterThesisCBS/Lambada_Norwegian | 2023-09-18T16:17:22.000Z | [
"region:us"
] | MasterThesisCBS | null | null | null | 0 | 0 | Entry not found |
jtlowell/class_tarot | 2023-09-18T16:32:18.000Z | [
"region:us"
] | jtlowell | null | null | null | 0 | 0 | Entry not found |
nguyenthanhdo/vhac_v2 | 2023-09-18T16:35:46.000Z | [
"region:us"
] | nguyenthanhdo | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 346229589
num_examples: 108658
download_size: 163968580
dataset_size: 346229589
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "vhac_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
manu/europarl-en-fr | 2023-09-18T16:41:47.000Z | [
"region:us"
] | manu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 685175635
num_examples: 2051014
download_size: 413609385
dataset_size: 685175635
---
# Dataset Card for "europarl-en-fr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dune10991/lodka | 2023-09-18T16:52:33.000Z | [
"region:us"
] | dune10991 | null | null | null | 0 | 0 | Entry not found |
dune10991/hut | 2023-09-19T02:21:06.000Z | [
"region:us"
] | dune10991 | null | null | null | 0 | 0 | Entry not found |
Goldyhghoul/LouisBeaters | 2023-09-18T17:16:29.000Z | [
"region:us"
] | Goldyhghoul | null | null | null | 0 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Lonaz/Elaina_3D | 2023-09-18T17:11:26.000Z | [
"region:us"
] | Lonaz | null | null | null | 0 | 0 | Entry not found |
CyberHarem/hitamu_kyan_futokunoguild | 2023-09-18T17:29:41.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Hitamu Kyan
This is the dataset of Hitamu Kyan, containing 296 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 296 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 719 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 296 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 296 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 296 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 296 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 296 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 719 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 719 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 719 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
linhqyy/whisper_largev2_test_results | 2023-09-18T17:35:37.000Z | [
"region:us"
] | linhqyy | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: predictions
dtype: string
- name: references
dtype: string
splits:
- name: test
num_bytes: 70565
num_examples: 748
download_size: 36811
dataset_size: 70565
---
# Dataset Card for "whisper_largev2_test_results"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Viniciaao/Gab | 2023-09-18T17:46:16.000Z | [
"license:openrail",
"region:us"
] | Viniciaao | null | null | null | 0 | 0 | ---
license: openrail
---
|
CyberHarem/maidena_ange_futokunoguild | 2023-09-18T17:45:45.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Maidena Ange
This is the dataset of Maidena Ange, containing 220 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 220 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 542 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 220 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 220 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 220 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 220 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 220 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 542 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 542 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 542 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/toxico_dannar_futokunoguild | 2023-09-18T18:09:10.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Toxico Dannar
This is the dataset of Toxico Dannar, containing 270 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 270 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 613 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 270 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 270 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 270 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 270 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 270 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 613 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 613 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 613 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
fundrais123/processed_demo | 2023-09-18T18:14:42.000Z | [
"region:us"
] | fundrais123 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: package_name
dtype: string
- name: review
dtype: string
- name: date
dtype: string
- name: star
dtype: int64
- name: version_id
dtype: int64
splits:
- name: train
num_bytes: 1508
num_examples: 5
- name: test
num_bytes: 956
num_examples: 5
download_size: 9451
dataset_size: 2464
---
# Dataset Card for "processed_demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/sharegptPIPPA | 2023-09-18T18:28:28.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | ---
license: mit
---
|
CyberHarem/hanabata_nohkins_futokunoguild | 2023-09-18T18:27:58.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Hanabata Nohkins
This is the dataset of Hanabata Nohkins, containing 225 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 225 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 523 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 225 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 225 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 225 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 225 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 225 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 523 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 523 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 523 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
clem/prompts | 2023-09-22T01:19:50.000Z | [
"license:apache-2.0",
"region:us"
] | clem | null | null | null | 1 | 0 | ---
license: apache-2.0
---
This is my collection of prompts to increase my productivity as a co-founder and CEO at Hugging Face. |
CyberHarem/enome_futokunoguild | 2023-09-18T18:38:54.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Enome
This is the dataset of Enome, containing 146 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 146 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 350 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 146 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 146 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 146 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 146 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 146 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 350 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 350 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 350 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
NobodyExistsOnTheInternet/expSharePippa | 2023-09-18T18:48:56.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | ---
license: mit
---
|
razavistag/test-dataset | 2023-09-18T19:01:45.000Z | [
"region:us"
] | razavistag | null | null | null | 0 | 0 | Entry not found |
Zerenidel/Ulti_OP | 2023-09-19T22:31:28.000Z | [
"region:us"
] | Zerenidel | null | null | null | 0 | 0 | Entry not found |
Ali-C137/MAD-Main-Test | 2023-09-18T19:05:12.000Z | [
"region:us"
] | Ali-C137 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: GenId
dtype: int64
- name: SubId
dtype: int64
- name: DatasetName
dtype: string
- name: DatasetLink
dtype: string
- name: Text
dtype: string
- name: MetaData
struct:
- name: __index_level_0__
dtype: int64
- name: created_date
dtype: string
- name: deleted
dtype: bool
- name: detoxify
dtype: 'null'
- name: emojis
struct:
- name: count
sequence: int32
- name: name
sequence: string
- name: id
dtype: string
- name: labels
struct:
- name: count
sequence: int32
- name: name
sequence: string
- name: value
sequence: float64
- name: lang
dtype: string
- name: message_id
dtype: string
- name: message_tree_id
dtype: string
- name: model_name
dtype: 'null'
- name: parent_id
dtype: string
- name: rank
dtype: float64
- name: review_count
dtype: int32
- name: review_result
dtype: bool
- name: role
dtype: string
- name: synthetic
dtype: bool
- name: tree_state
dtype: string
- name: user_id
dtype: string
- name: ConcatenatedText
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 87616889
num_examples: 67073
download_size: 34138667
dataset_size: 87616889
---
# Dataset Card for "MAD-Main-Test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Stranic070/Almaz | 2023-09-18T19:14:28.000Z | [
"region:us"
] | Stranic070 | null | null | null | 0 | 0 | Entry not found |
asoria/sample-script | 2023-09-18T19:27:15.000Z | [
"task_categories:text-classification",
"task_ids:sentiment-analysis",
"annotations_creators:expert-generated",
"language_creators:found",
"multilinguality:monolingual",
"size_categories:1K<n<10K",
"source_datasets:original",
"language:yo",
"license:unknown",
"movie reviews",
"nollywood",
"arxi... | asoria | YOSM: A NEW YORUBA SENTIMENT CORPUS FOR MOVIE REVIEWS
- Yoruba | @inproceedings{
shode2022yosm,
title={{YOSM}: A {NEW} {YORUBA} {SENTIMENT} {CORPUS} {FOR} {MOVIE} {REVIEWS}},
author={Iyanuoluwa Shode and David Ifeoluwa Adelani and Anna Feldman},
booktitle={3rd Workshop on African Natural Language Processing},
year={2022},
url={https://openreview.net/forum?id=rRzx5qzVIb9}
} | null | 0 | 0 | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- yo
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- movie reviews
- nollywood
task_categories:
- text-classification
task_ids:
- sentiment-analysis
---
# Dataset Card for YOSM
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** [Iyanuoluwa/YOSM](https://github.com/IyanuSh/YOSM)
- **Paper:** [A new Yorùbá Sentiment Corpus for Nigerian/Nollywood Movie Reviews](https://arxiv.org/pdf/2204.09711.pdf)
- **Point of Contact:** [Iyanuoluwa Shode](mailto:shodei1@montclair.edu)
### Dataset Summary
YOSM is the first Yorùbá sentiment corpus for Nollywood movie reviews. The reviews were collected from movie review websites: IMDB, Rotten Tomatoes, LetterboxD, Cinemapointer, and Nollyrated.
### Languages
Yorùbá (ISO 639-1: yo) - the third most spoken indigenous African language with over 50 million speakers.
## Dataset Structure
### Data Instances
An instance consists of a movie review and the corresponding class label.
### Data Fields
- `yo_review`: A movie review in Yorùbá
- `sentiment`: The label describing the sentiment of the movie review.
### Data Splits
The YOSM dataset has 3 splits: _train_, _dev_, and _test_. Below are the statistics for Version 3.0.0 of the dataset.
| Dataset Split | Number of Instances in Split |
| ------------- | ------------------------------------------- |
| Train | 800 |
| Development | 200 |
| Test | 500 |
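A minimal loading sketch follows; the repo id below is the one hosting this card and is an assumption, as are the exact split names beyond `train`:
```python
from datasets import load_dataset

# A minimal sketch for loading YOSM. The repo id is the one hosting this
# card and is an assumption; adjust it if the corpus lives elsewhere.
yosm = load_dataset("asoria/sample-script")

example = yosm["train"][0]
print(example["yo_review"])  # a movie review in Yorùbá
print(example["sentiment"])  # its sentiment label
```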
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
|
Gooly/dataset | 2023-09-18T19:41:36.000Z | [
"license:mit",
"region:us"
] | Gooly | null | null | null | 0 | 0 | ---
license: mit
---
|
Resizable/FuckingEncrustedTesticle | 2023-09-18T19:50:22.000Z | [
"license:openrail",
"region:us"
] | Resizable | null | null | null | 0 | 0 | ---
license: openrail
---
|
benderrodriguez/audio-v2 | 2023-09-19T00:04:31.000Z | [
"region:us"
] | benderrodriguez | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2 | 2023-09-18T20:07:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of migtissera/Synthia-34B-v1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Synthia-34B-v1.2](https://huggingface.co/migtissera/Synthia-34B-v1.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T20:05:34.645170](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2/blob/main/results_2023-09-18T20-05-34.645170.json) (note\
\ that there might be results for other tasks in this repo if successive evals didn't\
\ cover the same tasks; you can find each of them in the \"results\" configuration and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5320903185183409,\n\
\ \"acc_stderr\": 0.03517517994960793,\n \"acc_norm\": 0.5358397153796313,\n\
\ \"acc_norm_stderr\": 0.03516397638431902,\n \"mc1\": 0.2998776009791922,\n\
\ \"mc1_stderr\": 0.01604035296671362,\n \"mc2\": 0.4467341818408572,\n\
\ \"mc2_stderr\": 0.014969799807071376\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5119453924914675,\n \"acc_stderr\": 0.014607220340597171,\n\
\ \"acc_norm\": 0.5486348122866894,\n \"acc_norm_stderr\": 0.01454210456995527\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5587532364070902,\n\
\ \"acc_stderr\": 0.00495521278783238,\n \"acc_norm\": 0.7432782314280024,\n\
\ \"acc_norm_stderr\": 0.004359318206428689\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.5433962264150943,\n \"acc_stderr\": 0.030656748696739428,\n \
\ \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.030656748696739428\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111502,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111502\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.03812400565974834,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.03812400565974834\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n\
\ \"acc_stderr\": 0.027575960723278243,\n \"acc_norm\": 0.6225806451612903,\n\
\ \"acc_norm_stderr\": 0.027575960723278243\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n\
\ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325226,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325226\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.032018671228777947,\n\
\ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.032018671228777947\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.02534267129380725,\n \
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.02534267129380725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114982,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114982\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6954128440366972,\n \"acc_stderr\": 0.019732299420354052,\n \"\
acc_norm\": 0.6954128440366972,\n \"acc_norm_stderr\": 0.019732299420354052\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4027777777777778,\n \"acc_stderr\": 0.033448873829978666,\n \"\
acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.033448873829978666\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475544,\n \"\
acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475544\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395593,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395593\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5426008968609866,\n\
\ \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.5426008968609866,\n\
\ \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.043749285605997376,\n\
\ \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.043749285605997376\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884123,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884123\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.04616631111801714,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.04616631111801714\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n\
\ \"acc_stderr\": 0.0276019213814176,\n \"acc_norm\": 0.7692307692307693,\n\
\ \"acc_norm_stderr\": 0.0276019213814176\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6743295019157088,\n\
\ \"acc_stderr\": 0.016757989458549675,\n \"acc_norm\": 0.6743295019157088,\n\
\ \"acc_norm_stderr\": 0.016757989458549675\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.02651126136940925,\n\
\ \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.02651126136940925\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n\
\ \"acc_stderr\": 0.016051419760310263,\n \"acc_norm\": 0.35977653631284917,\n\
\ \"acc_norm_stderr\": 0.016051419760310263\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.028580341065138296,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.028580341065138296\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.027667138569422708,\n\
\ \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.027667138569422708\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.378748370273794,\n\
\ \"acc_stderr\": 0.012389052105003732,\n \"acc_norm\": 0.378748370273794,\n\
\ \"acc_norm_stderr\": 0.012389052105003732\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.40808823529411764,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.40808823529411764,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.48366013071895425,\n \"acc_stderr\": 0.020217030653186457,\n \
\ \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.020217030653186457\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.047381987035454834,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.047381987035454834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.031343283582089536,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.031343283582089536\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03615507630310935,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03615507630310935\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n\
\ \"mc1_stderr\": 0.01604035296671362,\n \"mc2\": 0.4467341818408572,\n\
\ \"mc2_stderr\": 0.014969799807071376\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/Synthia-34B-v1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|arc:challenge|25_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hellaswag|10_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T20-05-34.645170.parquet'
- config_name: results
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- results_2023-09-18T20-05-34.645170.parquet
- split: latest
path:
- results_2023-09-18T20-05-34.645170.parquet
---
# Dataset Card for Evaluation run of migtissera/Synthia-34B-v1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Synthia-34B-v1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Synthia-34B-v1.2](https://huggingface.co/migtissera/Synthia-34B-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
# "latest" always resolves to the most recent evaluation run (see the configs above).
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2",
                    "harness_truthfulqa_mc_0",
                    split="latest")
```
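The aggregated metrics live in the "results" config defined above. A minimal sketch of loading them follows; the exact layout of the parquet row is an assumption here:
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics; its "latest" split points
# at results_2023-09-18T20-05-34.645170.parquet (see the configs list above).
results = load_dataset(
    "open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2",
    "results",
    split="latest",
)
print(results[0])  # assumption: a single row carrying the aggregated scores
```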
## Latest results
These are the [latest results from run 2023-09-18T20:05:34.645170](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2/blob/main/results_2023-09-18T20-05-34.645170.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the per-run and "latest" splits of each eval):
```python
{
"all": {
"acc": 0.5320903185183409,
"acc_stderr": 0.03517517994960793,
"acc_norm": 0.5358397153796313,
"acc_norm_stderr": 0.03516397638431902,
"mc1": 0.2998776009791922,
"mc1_stderr": 0.01604035296671362,
"mc2": 0.4467341818408572,
"mc2_stderr": 0.014969799807071376
},
"harness|arc:challenge|25": {
"acc": 0.5119453924914675,
"acc_stderr": 0.014607220340597171,
"acc_norm": 0.5486348122866894,
"acc_norm_stderr": 0.01454210456995527
},
"harness|hellaswag|10": {
"acc": 0.5587532364070902,
"acc_stderr": 0.00495521278783238,
"acc_norm": 0.7432782314280024,
"acc_norm_stderr": 0.004359318206428689
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.030656748696739428,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.030656748696739428
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111502,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111502
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.03812400565974834,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.03812400565974834
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278243,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278243
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.032018671228777947,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.032018671228777947
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.02534267129380725,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.02534267129380725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114982,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114982
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6954128440366972,
"acc_stderr": 0.019732299420354052,
"acc_norm": 0.6954128440366972,
"acc_norm_stderr": 0.019732299420354052
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.033448873829978666,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.033448873829978666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475544,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475544
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395593,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395593
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5426008968609866,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.5426008968609866,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.043749285605997376,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.043749285605997376
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884123,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884123
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.04616631111801714,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.04616631111801714
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7692307692307693,
"acc_stderr": 0.0276019213814176,
"acc_norm": 0.7692307692307693,
"acc_norm_stderr": 0.0276019213814176
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6743295019157088,
"acc_stderr": 0.016757989458549675,
"acc_norm": 0.6743295019157088,
"acc_norm_stderr": 0.016757989458549675
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5867052023121387,
"acc_stderr": 0.02651126136940925,
"acc_norm": 0.5867052023121387,
"acc_norm_stderr": 0.02651126136940925
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310263,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310263
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.028580341065138296,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.028580341065138296
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5524691358024691,
"acc_stderr": 0.027667138569422708,
"acc_norm": 0.5524691358024691,
"acc_norm_stderr": 0.027667138569422708
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614105,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614105
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.378748370273794,
"acc_stderr": 0.012389052105003732,
"acc_norm": 0.378748370273794,
"acc_norm_stderr": 0.012389052105003732
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40808823529411764,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.40808823529411764,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.48366013071895425,
"acc_stderr": 0.020217030653186457,
"acc_norm": 0.48366013071895425,
"acc_norm_stderr": 0.020217030653186457
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.047381987035454834,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.047381987035454834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.031343283582089536,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.031343283582089536
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03615507630310935,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03615507630310935
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2998776009791922,
"mc1_stderr": 0.01604035296671362,
"mc2": 0.4467341818408572,
"mc2_stderr": 0.014969799807071376
}
}
```
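The `"all"` block appears to be the unweighted mean of the per-task scores. A minimal sketch that recomputes it from the results file, assuming the per-task dicts sit under a top-level `"results"` key as in the linked JSON:
```python
import json
from statistics import mean

# Assumption: the downloaded file results_2023-09-18T20-05-34.645170.json
# keeps the per-task dicts shown above under a top-level "results" key.
with open("results_2023-09-18T20-05-34.645170.json") as f:
    results = json.load(f)["results"]

accs = [v["acc"] for name, v in results.items() if name != "all" and "acc" in v]
print(f"{len(accs)} tasks, mean acc = {mean(accs):.4f}")
# Compare with results["all"]["acc"] (0.5321 for this run).
```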
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
newjins228/kl | 2023-09-19T03:46:40.000Z | [
"region:us"
] | newjins228 | null | null | null | 0 | 0 | Entry not found |
Roscall/meiko.naka | 2023-09-18T20:42:37.000Z | [
"region:us"
] | Roscall | null | null | null | 0 | 0 | Entry not found |
AWfaw/ai-hdlcoder-pretokenized-dataset-train | 2023-09-20T18:37:01.000Z | [
"region:us"
] | AWfaw | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: ratio_char_token
dtype: float64
splits:
- name: train
num_bytes: 1052402380
num_examples: 53550
download_size: 337098653
dataset_size: 1052402380
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ai-hdlcoder-pretokenized-dataset-train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RiddleHe/commonsense-baseline-text | 2023-09-27T16:02:42.000Z | [
"region:us"
] | RiddleHe | null | null | null | 0 | 0 | Entry not found |
llm4pm/process_mining_questions | 2023-09-21T07:40:46.000Z | [
"language:en",
"license:gpl-2.0",
"region:us"
] | llm4pm | null | null | null | 0 | 0 | ---
license: gpl-2.0
language:
- en
--- |
untilthend/lite | 2023-09-18T21:46:54.000Z | [
"license:openrail",
"region:us"
] | untilthend | null | null | null | 0 | 0 | ---
license: openrail
---
|
eaglew/crop | 2023-09-18T22:11:38.000Z | [
"region:us"
] | eaglew | null | null | null | 1 | 0 | Entry not found |
BangumiBase/thedemongirlnextdoor | 2023-09-29T09:18:09.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of The Demon Girl Next Door
This is the image base of the bangumi The Demon Girl Next Door. We detected 18 characters and 3728 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noise.** If you intend to manually train models on this dataset, we recommend performing the necessary preprocessing on the downloaded files to eliminate potential noisy samples (approximately 1% probability).
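A minimal sketch of that preprocessing, assuming `all.zip` mirrors the per-character folders linked below (with the noise cluster under `-1/`):
```python
import shutil
import zipfile
from huggingface_hub import hf_hub_download

# Fetch and unpack the full image base from this repository.
archive = hf_hub_download(
    repo_id="BangumiBase/thedemongirlnextdoor",
    filename="all.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(archive) as zf:
    zf.extractall("the_demon_girl_next_door")

# Drop the detected-noise cluster; the remaining ~1% of noisy samples still
# deserve a manual pass before training.
shutil.rmtree("the_demon_girl_next_door/-1", ignore_errors=True)
```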
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1497 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 41 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 43 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 14 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 139 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 149 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 96 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 18 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 8 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 16 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 116 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 364 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 823 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 136 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 46 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 5 | [Download](15/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 16 | 105 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 112 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
BangumiBase/jashinchandropkickx | 2023-09-29T09:24:49.000Z | [
"size_categories:n<1K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- n<1K
---
# Bangumi Image Base of Jashin-chan Dropkick X
This is the image base of the bangumi Jashin-chan Dropkick X. We detected 19 characters and 795 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noise.** If you intend to manually train models on this dataset, we recommend performing the necessary preprocessing on the downloaded files to eliminate potential noisy samples (approximately 1% probability).
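Once unpacked into per-character folders, a minimal sketch for turning the images into a labeled training set (folder names become class labels; the local path is an assumption):
```python
from datasets import load_dataset

# Assumption: all.zip was extracted to ./jashin_chan_dropkick_x/, one folder
# per detected character as in the preview table below.
ds = load_dataset("imagefolder", data_dir="jashin_chan_dropkick_x", split="train")
print(ds.features)  # an "image" column plus a "label" ClassLabel from folder names
```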
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 80 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 124 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 69 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 15 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 27 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 23 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 40 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 6 | [Download](7/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 8 | 24 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 29 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 33 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 55 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 58 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 39 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 26 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 18 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 19 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 5 | [Download](17/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| noise | 105 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|