id stringlengths 2 115 | author stringlengths 2 42 ⌀ | last_modified timestamp[us, tz=UTC] | downloads int64 0 8.87M | likes int64 0 3.84k | paperswithcode_id stringlengths 2 45 ⌀ | tags list | lastModified timestamp[us, tz=UTC] | createdAt stringlengths 24 24 | key stringclasses 1 value | created timestamp[us] | card stringlengths 1 1.01M | embedding list | library_name stringclasses 21 values | pipeline_tag stringclasses 27 values | mask_token null | card_data null | widget_data null | model_index null | config null | transformers_info null | spaces null | safetensors null | transformersInfo null | modelId stringlengths 5 111 ⌀ | embeddings list |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
bambadij/COVID_Vaccine_Tweet_sentiment_analysis_roberta | bambadij | 2023-11-13T04:36:13Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T04:36:13Z | 2023-11-13T04:36:10.000Z | 2023-11-13T04:36:10 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 1827789
num_examples: 7999
- name: eval
num_bytes: 527000
num_examples: 2000
download_size: 569067
dataset_size: 2354789
---
# Dataset Card for "COVID_Vaccine_Tweet_sentiment_analysis_roberta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.2455330491065979,
-0.41569432616233826,
0.042077433317899704,
0.353169322013855,
-0.3100387752056122,
0.06542257219552994,
0.16456447541713715,
0.07778939604759216,
0.8521340489387512,
0.05653510242700577,
-0.8663597106933594,
-1.0529494285583496,
-0.813046932220459,
-0.2570908069610595... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
kaist-ai/volcano-train | kaist-ai | 2023-11-13T11:37:08Z | 0 | 0 | null | [
"task_categories:image-to-text",
"size_categories:1M<n<10M",
"language:en",
"image-to-text",
"image-captioning",
"visual-question-answering",
"region:us"
] | 2023-11-13T11:37:08Z | 2023-11-13T04:40:15.000Z | 2023-11-13T04:40:15 | ---
task_categories:
- image-to-text
language:
- en
tags:
- image-to-text
- image-captioning
- visual-question-answering
size_categories:
- 1M<n<10M
---
# Data details
- **274K multimodal feedback and revision data**
- 558K filtered image-text pairs from LAION/CC/SBU, captioned by BLIP.
- 158K GPT-generated multimodal instruction-following data.
- 450K academic-task-oriented VQA data mixture.
- 40K ShareGPT data
# Data collection

Since no multimodal feedback data for training is publicly available as of this writing and human labeling is costly, we used a proprietary LLM to generate feedback data.
As shown in the figure, we use an open-source LMM to provide an initial answer to a question about an image. Since current proprietary LLMs cannot process images, we provide object details and captions in text as a proxy for the image. For each data instance, we feed the LLM the image information (object details and captions), the question, the initial response, and the gold answer as a reference, allowing the model to evaluate the given inputs and produce feedback. The proprietary LLM might exploit the gold answer when generating feedback, which could lead to inaccurate feedback at inference time, when no gold answer is provided. To avoid this, we give the LLM explicit prompts to rely on the text-formatted image details when generating feedback. When constructing the revision data, we set up the system to predict the existing gold answer as the output, using the feedback, image, question, and initial response obtained in the previous steps as input, without any separate model generation process.
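As a minimal illustration of this step, the sketch below assembles a feedback prompt from the pieces described above; the wording, function name, and argument names are illustrative assumptions, not the exact prompts used to build this dataset.
```python
# Hedged sketch of the feedback-generation step described above.
# Prompt wording and names are assumptions for illustration only.
def build_feedback_prompt(object_details, captions, question, initial_response, gold_answer):
    return (
        "You are given text-formatted image details instead of the image itself.\n"
        f"Objects: {object_details}\n"
        f"Captions: {captions}\n"
        f"Question: {question}\n"
        f"Initial response: {initial_response}\n"
        f"Reference (gold) answer: {gold_answer}\n"
        "Evaluate the initial response and write feedback, grounding it only in "
        "the image details above rather than in the reference answer."
    )
```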
Although Volcano is trained with the language modeling objective, consistent with traditional VLMs, it not only follows instructions but can also provide critical feedback based on image information and subsequently self-revise. This enhanced ability is attributed to Volcano's combined training on visual instruction tuning data, feedback data, and revision data. | [
-0.3901001513004303,
-0.7259229421615601,
0.3351708650588989,
0.028560511767864227,
-0.2867847979068756,
-0.091020867228508,
-0.2109857201576233,
-0.4609419107437134,
0.0002360981743549928,
0.6327611207962036,
-0.8654171824455261,
-0.6394075751304626,
-0.304110586643219,
0.0526624470949173... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
sivan22/sefaria-he-new | sivan22 | 2023-11-16T12:56:51Z | 0 | 0 | null | [
"region:us"
] | 2023-11-16T12:56:51Z | 2023-11-13T04:47:14.000Z | 2023-11-13T04:47:14 | Entry not found | [
-0.32276487350463867,
-0.22568444907665253,
0.8622263073921204,
0.43461570143699646,
-0.5282988548278809,
0.7012969255447388,
0.7915717363357544,
0.07618642598390579,
0.7746027112007141,
0.25632190704345703,
-0.7852815389633179,
-0.22573848068714142,
-0.910447895526886,
0.5715675354003906,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
sivan22/sefaria-hebrew | sivan22 | 2023-11-13T05:08:42Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T05:08:42Z | 2023-11-13T04:59:51.000Z | 2023-11-13T04:59:51 | ---
dataset_info:
features:
- name: language
dtype: string
- name: title
dtype: string
- name: versionSource
dtype: string
- name: versionTitle
dtype: string
- name: status
dtype: string
- name: license
dtype: string
- name: versionTitleInHebrew
dtype: string
- name: actualLanguage
dtype: string
- name: isBaseText
dtype: bool
- name: level_1_index
dtype: float64
- name: level_2_index
dtype: float64
- name: level_3_index
dtype: float64
- name: level_4_index
dtype: float64
- name: level_5_index
dtype: float64
- name: text
dtype: string
- name: versionNotes
dtype: string
- name: versionNotesInHebrew
dtype: string
- name: method
dtype: string
- name: digitizedBySefaria
dtype: float64
- name: heversionSource
dtype: string
- name: priority
dtype: float64
- name: shortVersionTitle
dtype: string
- name: purchaseInformationImage
dtype: string
- name: purchaseInformationURL
dtype: string
splits:
- name: train
num_bytes: 1901352817
num_examples: 1955969
download_size: 544170227
dataset_size: 1901352817
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sefaria-hebrew"
This dataset contains Jewish texts in Hebrew from the Sefaria project.
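A minimal loading sketch, assuming the default config and the features listed above:
```python
# Hedged sketch: load the train split and inspect one record.
from datasets import load_dataset

ds = load_dataset("sivan22/sefaria-hebrew", split="train")
print(ds[0]["title"], ds[0]["text"][:100])
```
| [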
-0.141387939453125,
-0.30602481961250305,
-0.41695937514305115,
0.23892243206501007,
-0.5341281890869141,
-0.06303978711366653,
0.1203487366437912,
-0.16453146934509277,
0.6338688135147095,
0.876635730266571,
-0.5432236194610596,
-0.9786520600318909,
-0.5622424483299255,
-0.138660669326782... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
joey1895/rename | joey1895 | 2023-11-13T05:17:40Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-13T05:17:40Z | 2023-11-13T05:17:40.000Z | 2023-11-13T05:17:40 | ---
license: mit
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
jovianzm/test | jovianzm | 2023-11-13T05:37:53Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T05:37:53Z | 2023-11-13T05:37:38.000Z | 2023-11-13T05:37:38 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
bambadij/Tweet_sentiment_analysis_Distilbert | bambadij | 2023-11-13T05:54:50Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T05:54:50Z | 2023-11-13T05:54:48.000Z | 2023-11-13T05:54:48 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 1712789
num_examples: 7999
- name: eval
num_bytes: 472000
num_examples: 2000
download_size: 505986
dataset_size: 2184789
---
# Dataset Card for "Tweet_sentiment_analysis_Distilbert"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.3752391040325165,
-0.36787286400794983,
0.179892897605896,
0.6246352195739746,
-0.40593186020851135,
0.4119492769241333,
0.10343652963638306,
0.2185972034931183,
0.7914137840270996,
0.005927815102040768,
-0.9353554844856262,
-1.006704568862915,
-1.005680799484253,
-0.3454241454601288,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Hoonhun/llama4recipe | Hoonhun | 2023-11-13T06:14:41Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T06:14:41Z | 2023-11-13T06:07:19.000Z | 2023-11-13T06:07:19 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Simonk97/ungquachung | Simonk97 | 2023-11-13T06:25:21Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-13T06:25:21Z | 2023-11-13T06:23:38.000Z | 2023-11-13T06:23:38 | ---
license: openrail
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Edmon02/AudioBook | Edmon02 | 2023-11-13T06:26:35Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-13T06:26:35Z | 2023-11-13T06:26:35.000Z | 2023-11-13T06:26:35 | ---
license: apache-2.0
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Raspberry-ai/aritzia8AT60TjozU_raw_v1 | Raspberry-ai | 2023-11-13T06:35:07Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T06:35:07Z | 2023-11-13T06:34:59.000Z | 2023-11-13T06:34:59 | ---
dataset_info:
features:
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: image_path
dtype: string
- name: brand_name
dtype: string
- name: overall_description
dtype: string
- name: style_description
dtype: string
- name: product_category
dtype: string
- name: product_group_category
dtype: string
- name: section_name
dtype: string
- name: created_at
dtype: string
- name: image_urls
dtype: string
- name: images
dtype: string
- name: images_path
sequence: string
- name: source
dtype: string
- name: caption
dtype: string
- name: s3_uris
sequence: string
splits:
- name: train
num_bytes: 12837088
num_examples: 2768
download_size: 4860063
dataset_size: 12837088
---
# Dataset Card for "aritzia_raw_dataset_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6240705847740173,
-0.2286859005689621,
0.0544087179005146,
0.3517642319202423,
-0.29162752628326416,
-0.46705445647239685,
0.37022921442985535,
-0.19074054062366486,
0.8461124300956726,
0.5345038175582886,
-1.059027075767517,
-0.8050422668457031,
-0.47520384192466736,
-0.233981102705001... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
nlplabtdtu/OpenOrca-predict-people-action-vi | nlplabtdtu | 2023-11-13T07:02:54Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T07:02:54Z | 2023-11-13T07:02:31.000Z | 2023-11-13T07:02:31 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Pablao0948/Little_Hair | Pablao0948 | 2023-11-13T07:21:30Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-13T07:21:30Z | 2023-11-13T07:20:36.000Z | 2023-11-13T07:20:36 | ---
license: openrail
---
| [
-0.1285339742898941,
-0.18616800010204315,
0.6529127359390259,
0.4943626821041107,
-0.1931934952735901,
0.2360742688179016,
0.360720157623291,
0.05056300014257431,
0.5793654322624207,
0.7400140166282654,
-0.6508105993270874,
-0.23783984780311584,
-0.7102248668670654,
-0.047826044261455536,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
wesley7137/neuro_qa_SFT_Trainer | wesley7137 | 2023-11-13T09:59:24Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T09:59:24Z | 2023-11-13T07:29:17.000Z | 2023-11-13T07:29:17 | Entry not found | [
-0.32276487350463867,
-0.22568444907665253,
0.8622263073921204,
0.43461570143699646,
-0.5282988548278809,
0.7012969255447388,
0.7915717363357544,
0.07618642598390579,
0.7746027112007141,
0.25632190704345703,
-0.7852815389633179,
-0.22573848068714142,
-0.910447895526886,
0.5715675354003906,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
kedzkiest/MySummary | kedzkiest | 2023-11-13T07:35:20Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T07:35:20Z | 2023-11-13T07:35:20.000Z | 2023-11-13T07:35:20 | Entry not found | [
-0.32276487350463867,
-0.22568444907665253,
0.8622263073921204,
0.43461570143699646,
-0.5282988548278809,
0.7012969255447388,
0.7915717363357544,
0.07618642598390579,
0.7746027112007141,
0.25632190704345703,
-0.7852815389633179,
-0.22573848068714142,
-0.910447895526886,
0.5715675354003906,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Fiaa/dClub | Fiaa | 2023-11-13T08:07:03Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T08:07:03Z | 2023-11-13T07:57:01.000Z | 2023-11-13T07:57:01 | ## Dataset Description
**dCluB** is the dataset collected in the paper: Dynamic Clue Bottlenecks: Interpretable by Design Visual Question Answering.
Given an image and a question, the goal is to build visual question answering models that are interpretable and faithful by design, by generating intermediate structures called "visual clues".
The dataset was carefully collected via mTurk.
In the paper, we present our own method, provide insights into how bottleneck models for VQA can be constructed, and analyze how future work might mitigate the shortcomings of black-box models.
We aim for dCluB to serve as a useful resource for advancing the development of interpretable and faithful VQA models and driving further progress in the field.
## Download Data
The train, val, and test sets are in separate JSON files, and the images are the COCO images used in VQA v2. You can load the data as follows:
```
from datasets import load_dataset
examples = load_dataset('Fiaa/dCluB')
```
---
license: apache-2.0
---
| [
-0.39561229944229126,
-0.8408971428871155,
0.3126218914985657,
0.07952626794576645,
-0.5520505309104919,
0.3933649957180023,
0.1751677244901657,
-0.5485429167747498,
-0.284855455160141,
0.5941454172134399,
-1.2019541263580322,
-0.45021387934684753,
-0.16997072100639343,
0.1179223582148552,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Back-up/temp-qa-context | Back-up | 2023-11-13T14:46:59Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T14:46:59Z | 2023-11-13T07:58:17.000Z | 2023-11-13T07:58:17 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: response
struct:
- name: response
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: instruction
dtype: string
- name: prompt_name
dtype: string
splits:
- name: train
num_bytes: 10011005
num_examples: 3604
download_size: 3752328
dataset_size: 10011005
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "temp-qa-context"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.5202964544296265,
-0.2232261747121811,
0.38748183846473694,
0.12706412374973297,
-0.3886567950248718,
-0.15135101974010468,
0.3305567502975464,
0.05062364414334297,
0.841584324836731,
0.27887436747550964,
-0.7428802251815796,
-0.7310693860054016,
-0.2787470519542694,
-0.2564411759376526... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
JewelC/test | JewelC | 2023-11-13T08:20:54Z | 0 | 0 | null | [
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2023-11-13T08:20:54Z | 2023-11-13T08:20:54.000Z | 2023-11-13T08:20:54 | ---
license: cc-by-nc-sa-4.0
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
TinyPixel/lima-u2 | TinyPixel | 2023-11-13T08:21:57Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T08:21:57Z | 2023-11-13T08:21:56.000Z | 2023-11-13T08:21:56 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1791607
num_examples: 780
download_size: 1041796
dataset_size: 1791607
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "lima-u2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.5726203918457031,
-0.35018470883369446,
0.3366532325744629,
0.6033220887184143,
-0.4725847840309143,
-0.07858796417713165,
0.5210864543914795,
-0.20291145145893097,
0.9275995492935181,
0.4996700882911682,
-0.9105260372161865,
-0.8237468600273132,
-0.7771002650260925,
-0.1111476719379425... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
MichaelJH/Ryu-AI.datadict | MichaelJH | 2023-11-13T08:22:24Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T08:22:24Z | 2023-11-13T08:22:21.000Z | 2023-11-13T08:22:21 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
sequence: string
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 7090578
num_examples: 21460
download_size: 1783799
dataset_size: 7090578
---
# Dataset Card for "Ryu-AI.datadict"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6099268794059753,
-0.31849440932273865,
0.25279030203819275,
0.1321863979101181,
-0.1849324107170105,
0.060535743832588196,
0.2543180584907532,
-0.18798698484897614,
1.040204405784607,
0.5576010346412659,
-0.9399334192276001,
-0.7008324861526489,
-0.5662872195243835,
-0.0863630548119545... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
poorfish/fishdataset2 | poorfish | 2023-11-13T08:46:23Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-13T08:46:23Z | 2023-11-13T08:46:23.000Z | 2023-11-13T08:46:23 | ---
license: mit
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ErhaChen/glassmorphism_icon | ErhaChen | 2023-11-13T09:03:04Z | 0 | 0 | null | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:apache-2.0",
"icon",
"style",
"region:us"
] | 2023-11-13T09:03:04Z | 2023-11-13T08:52:35.000Z | 2023-11-13T08:52:35 | ---
license: apache-2.0
task_categories:
- text-to-image
tags:
- icon
- style
size_categories:
- n<1K
--- | [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
kostasGRG/greek-twitter-multimodal-dataset | kostasGRG | 2023-11-13T09:22:15Z | 0 | 0 | null | [
"task_categories:text-classification",
"task_categories:image-classification",
"size_categories:n<1K",
"language:el",
"region:us"
] | 2023-11-13T09:22:15Z | 2023-11-13T09:08:02.000Z | 2023-11-13T09:08:02 | ---
task_categories:
- text-classification
- image-classification
language:
- el
size_categories:
- n<1K
---
This is a dataset for sentiment analysis, built from text-image pairs collected from Greek Twitter.
Posted: April 2023 to September 2023
Context: general purpose, mostly politics and athletics
Total pairs: 260
The dataset is intended to be used as a test set, not for training or fine-tuning a model.
Labels: Negative, Neutral, Positive
labelling.xlsx contains the labels for the texts, the images, and both modalities combined.
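A minimal reading sketch with pandas (requires `openpyxl`); the sheet and column layout are not documented here, so inspect the frame before relying on any names:
```python
# Hedged sketch: inspect labelling.xlsx; column names are unknown here.
import pandas as pd

labels = pd.read_excel("labelling.xlsx")
print(labels.columns.tolist())
print(labels.head())
```
| [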
-0.7389228940010071,
-0.4226798713207245,
0.44496843218803406,
0.11412573605775833,
-1.0033162832260132,
-0.08271672576665878,
0.08198431134223938,
-0.16964705288410187,
0.7902653217315674,
0.39611196517944336,
-0.8259546756744385,
-0.6719216704368591,
-0.8808875679969788,
0.27816629409790... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Rane2021/Med_train | Rane2021 | 2023-11-16T07:23:44Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-16T07:23:44Z | 2023-11-13T09:17:12.000Z | 2023-11-13T09:17:12 | ---
license: mit
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
PrevenIA/gold_standard_preguntas_suicidio_dataset | PrevenIA | 2023-11-13T09:32:02Z | 0 | 0 | null | [
"task_categories:question-answering",
"language:es",
"license:cc-by-3.0",
"suicidio",
"region:us"
] | 2023-11-13T09:32:02Z | 2023-11-13T09:27:13.000Z | 2023-11-13T09:27:13 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: Referencia
dtype: string
- name: Pregunta
dtype: string
- name: Respuesta
dtype: string
splits:
- name: train
num_bytes: 119693
num_examples: 118
download_size: 53288
dataset_size: 119693
license: cc-by-3.0
task_categories:
- question-answering
language:
- es
tags:
- suicidio
---
# Dataset Card for "gold_standard_preguntas_suicidio_dataset"
This dataset was extracted from the following documents:
- [Teléfono de la Esperanza. Guía del Suicidio.](https://telefonodelaesperanza.org/assets/Guia%20del%20suicidio.pdf).
- [Teléfono de la Esperanza. 5 preguntas frecuentes](https://telefonodelaesperanza.ch/5-preguntas-comunes-sobre-suicidio/)
- [Organización Mundial de la Salud](https://www.who.int/es/news-room/questions-and-answers/item/suicide)
- [Preguntas frecuentes sobre el suicidio. NIMH.](https://www.nimh.nih.gov/sites/default/files/health/publications/espanol/senales-de-advertencia-sobre-el-suicidio/preguntas-frecuentes-sobre-el-suicidio.pdf)
- [La conducta suicida. Información para pacientes, familiares y allegados. Anexo 1 de la Guía de Práctica Clínica de Prevención y Tratamiento de la Conducta Suicida. Ministerio de Sanidad, 2012.](https://www.fsme.es/centro-de-documentaci%C3%B3n-sobre-conducta-suicida/gu%C3%ADas-sobre-conducta-suicida/la-conducta-suicida-gpc-sns/)
- [Revisión de la Guía de Práctica Clínica de Prevención y Tratamiento de la Conducta Suicida (2012) del Programa de GPC en el SNS GUÍAS DE PRÁCTICA CLÍNICA EN EL SNS MINISTERIO DE SANIDAD](https://www.fsme.es/centro-de-documentaci%C3%B3n-sobre-conducta-suicida/gu%C3%ADas-sobre-conducta-suicida/gpc/) | [
-0.22827225923538208,
-0.37328627705574036,
0.45546287298202515,
0.12315794825553894,
-0.2492489367723465,
-0.41944077610969543,
0.13806678354740143,
0.13752366602420807,
0.27098771929740906,
0.6676489114761353,
-0.6393432021141052,
-1.2436615228652954,
-0.6652822494506836,
0.3068708181381... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
DLI-Lab/ESC-6K-ft-llama2-wo-filtering | DLI-Lab | 2023-11-13T09:30:03Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T09:30:03Z | 2023-11-13T09:29:53.000Z | 2023-11-13T09:29:53 | ---
dataset_info:
features:
- name: strg_pred
dtype: string
- name: strg_gold
dtype: string
- name: situation
dtype: string
- name: problem_type
dtype: string
- name: emotion_type
dtype: string
- name: context
dtype: string
- name: sample_id
dtype: string
- name: res_gold
dtype: string
- name: res_pred
dtype: string
splits:
- name: train
num_bytes: 18568804
num_examples: 10144
- name: test
num_bytes: 2294797
num_examples: 1295
- name: valid
num_bytes: 2355255
num_examples: 1271
download_size: 4844600
dataset_size: 23218856
---
# Dataset Card for "ESC-6K-ft-llama2-wo-filtering"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6579108834266663,
-0.3151746392250061,
0.5481018424034119,
0.1489456444978714,
-0.5686852931976318,
0.07685574889183044,
0.41856861114501953,
-0.46835798025131226,
0.8379498720169067,
0.8511925935745239,
-0.9857582449913025,
-0.9198948740959167,
-0.6631780862808228,
0.01032772846519947,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
BangumiBase/classroomoftheelite | BangumiBase | 2023-11-13T12:31:01Z | 0 | 0 | null | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | 2023-11-13T12:31:01Z | 2023-11-13T09:29:57.000Z | 2023-11-13T09:29:57 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Classroom Of The Elite
This is the image base of the bangumi Classroom of the Elite. We detected 58 characters and 4,577 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (roughly 1% of the images).
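As a minimal fetching sketch, the snippet below downloads one character subset, assuming the per-character zip layout shown in the preview table that follows:
```python
# Hedged sketch: download and extract one character's images.
from huggingface_hub import hf_hub_download
import zipfile

zip_path = hf_hub_download(
    repo_id="BangumiBase/classroomoftheelite",
    filename="2/dataset.zip",  # character #2 in the preview table
    repo_type="dataset",
)
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall("character_2")
```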
Here is a preview of the characters:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 131 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 108 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 975 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 10 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 16 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 55 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 138 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 193 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 67 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 88 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 48 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 22 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 18 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 45 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 38 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 23 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 51 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 14 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 12 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 12 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 20 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 190 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 11 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 15 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 18 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 12 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 36 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 558 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 23 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 70 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 12 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 58 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 45 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 54 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 34 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 14 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 293 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 38 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 8 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 284 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 42 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 75 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 26 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 136 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 9 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 26 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 12 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 8 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 9 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 28 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 65 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 21 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 9 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 6 | [Download](53/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 54 | 12 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 17 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 18 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 201 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
| [
-0.6722950339317322,
-0.16140061616897583,
0.12367746978998184,
0.17193682491779327,
-0.21313627064228058,
-0.031005362048745155,
-0.051022082567214966,
-0.31988149881362915,
0.6034467220306396,
0.49963515996932983,
-0.9165456295013428,
-0.8689994812011719,
-0.7020758390426636,
0.500460624... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
danielz01/aerial-animal-population-4tu | danielz01 | 2023-11-13T10:21:15Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T10:21:15Z | 2023-11-13T09:30:57.000Z | 2023-11-13T09:30:57 | ---
dataset_info:
features:
- name: image
dtype: image
- name: objects
struct:
- name: bbox
sequence:
sequence: int64
- name: categories
sequence: string
- name: path
dtype: string
splits:
- name: val
num_bytes: 571961850.0
num_examples: 56
- name: test
num_bytes: 1144658117.0
num_examples: 112
- name: train
num_bytes: 696798667.8
num_examples: 1600
download_size: 2396042886
dataset_size: 2413418634.8
configs:
- config_name: default
data_files:
- split: val
path: data/val-*
- split: test
path: data/test-*
- split: train
path: data/train-*
---
# Dataset Card for "aerial-animal-population-4tu"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.8499516844749451,
0.048477742820978165,
-0.11375582218170166,
0.23708295822143555,
-0.17408913373947144,
0.051931798458099365,
0.4600789546966553,
-0.2569139003753662,
0.8823992013931274,
0.4149587154388428,
-0.6083438992500305,
-0.6036192178726196,
-0.35259827971458435,
0.2161918878555... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
teowu/LSVQ-videos | teowu | 2023-11-13T14:23:27Z | 0 | 1 | null | [
"task_categories:video-classification",
"license:mit",
"video quality assessment",
"region:us"
] | 2023-11-13T14:23:27Z | 2023-11-13T09:31:18.000Z | 2023-11-13T09:31:18 | ---
license: mit
task_categories:
- video-classification
tags:
- video quality assessment
---
This is an **unofficial** copy of the videos in the *LSVQ dataset (Ying et al., CVPR, 2021)*, the largest dataset available for no-reference video quality assessment (NR-VQA). It is provided to facilitate research on this dataset, given that we have received several reports that the original download links are no longer available.
*See the [FAST-VQA](https://github.com/VQAssessment/FAST-VQA-and-FasterVQA) (Wu et al., ECCV, 2022) or [DOVER](https://github.com/VQAssessment/DOVER) (Wu et al., ICCV, 2023) repositories for the converted labels (i.e., quality scores for the videos).*
The label files in either of the repositories above are laid out as follows:
```
--- examplar_data_labels
--- --- train_labels.txt (this is the training set labels of LSVQ)
--- --- LSVQ
--- --- --- labels_test.txt (this is the LSVQ_test test subset)
--- --- --- labels_1080p.txt (this is the LSVQ_1080p test subset)
```
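A minimal parsing sketch, assuming each label line holds a video filename and a quality score separated by a comma (this layout is an assumption; verify it against the actual files):
```python
# Hedged sketch: read a FAST-VQA-style label file into a dict.
def load_labels(path):
    labels = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            name, score = line.rsplit(",", 1)
            labels[name.strip()] = float(score)
    return labels

train_labels = load_labels("examplar_data_labels/train_labels.txt")
```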
It should be noted that the copyright of this dataset still belongs to Facebook Research and the LIVE laboratory at UT Austin, and we may delete this unofficial repo at any time if requested by the copyright holders.
Here is the original copyright notice of this dataset, as follows.
-----------COPYRIGHT NOTICE STARTS WITH THIS LINE------------
Copyright (c) 2020 The University of Texas at Austin
All rights reserved.
Permission is hereby granted, without written agreement and without license or royalty fees, to use, copy, modify, and distribute this database (the images, the results and the source files) and its documentation for any purpose, provided that the copyright notice in its entirety appear in all copies of this database, and the original source of this database, Laboratory for Image and Video Engineering (LIVE, http://live.ece.utexas.edu ) at the University of Texas at Austin (UT Austin, http://www.utexas.edu ), is acknowledged in any publication that reports research using this database.
The following papers are to be cited in the bibliography whenever the database is used as:
Z. Ying, M. Mandal, D. Ghadiyaram and A.C. Bovik, "Patch-VQ: ‘Patching Up’ the Video Quality Problem," arXiv 2020.[paper]
Z. Ying, M. Mandal, D. Ghadiyaram and A.C. Bovik, "LIVE Large-Scale Social Video Quality (LSVQ) Database", Online:https://github.com/baidut/PatchVQ, 2020.
IN NO EVENT SHALL THE UNIVERSITY OF TEXAS AT AUSTIN BE LIABLE TO ANY PARTY FOR DIRECT, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OF THIS DATABASE AND ITS DOCUMENTATION, EVEN IF THE UNIVERSITY OF TEXAS AT AUSTIN HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
THE UNIVERSITY OF TEXAS AT AUSTIN SPECIFICALLY DISCLAIMS ANY WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE DATABASE PROVIDED HEREUNDER IS ON AN "AS IS" BASIS, AND THE UNIVERSITY OF TEXAS AT AUSTIN HAS NO OBLIGATION TO PROVIDE MAINTENANCE, SUPPORT, UPDATES, ENHANCEMENTS, OR MODIFICATIONS.
-----------COPYRIGHT NOTICE ENDS WITH THIS LINE------------ | [
-0.3329315781593323,
-0.6634756922721863,
0.14839136600494385,
0.10974329710006714,
-0.41378507018089294,
-0.12884753942489624,
0.16125763952732086,
-0.032872289419174194,
0.4013228118419647,
0.6583727598190308,
-0.5871196985244751,
-0.2803126275539398,
-0.26563602685928345,
0.094407320022... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
plaguss/ag_news_tutorial | plaguss | 2023-11-13T09:44:04Z | 0 | 0 | null | [
"size_categories:1K<n<10K",
"rlfh",
"argilla",
"human-feedback",
"region:us"
] | 2023-11-13T09:44:04Z | 2023-11-13T09:43:58.000Z | 2023-11-13T09:43:58 | ---
size_categories: 1K<n<10K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for ag_news_tutorial
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("plaguss/ag_news_tutorial")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("plaguss/ag_news_tutorial")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, and **guidelines**.
The **fields** are the dataset records themselves; for the moment, just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| text | Text from the article | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| label | In which category does this article fit? | label_selection | True | N/A | ['0', '1', '2', '3'] |
The **suggestions** are human- or machine-generated recommendations for each question to assist the annotator during the annotation process. They are always linked to the existing questions and are named by appending "-suggestion" and "-suggestion-metadata" to the question names, containing the value(s) of the suggestion and its metadata, respectively. Thus, the possible values are the same as in the table above, but the column name is appended with "-suggestion" and the metadata with "-suggestion-metadata".
**✨ NEW** The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
The **guidelines**, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": "record-0",
"fields": {
"text": "Wall St. Bears Claw Back Into the Black (Reuters) Reuters - Short-sellers, Wall Street\u0027s dwindling\\band of ultra-cynics, are seeing green again."
},
"metadata": {},
"responses": [],
"suggestions": []
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": "record-0",
"label": [],
"label-suggestion": null,
"label-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"metadata": "{}",
"text": "Wall St. Bears Claw Back Into the Black (Reuters) Reuters - Short-sellers, Wall Street\u0027s dwindling\\band of ultra-cynics, are seeing green again."
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment, just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **text** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **label** is of type `label_selection` with the following allowed values ['0', '1', '2', '3'].
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **label-suggestion** is of type `label_selection` with the following allowed values ['0', '1', '2', '3'].
Additionally, we also have two more fields that are optional and are the following:
* **✨ NEW** **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
This dataset contains a collection of news articles. Please label each article with the category it belongs to.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.6975184679031372,
-0.8475273847579956,
0.27705222368240356,
0.20750774443149567,
-0.3679121434688568,
-0.4238126277923584,
-0.010860061272978783,
-0.5636196732521057,
0.675724446773529,
0.726744532585144,
-0.6687340140342712,
-0.8685533404350281,
-0.660597562789917,
0.2566988170146942,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
davanstrien/BanglaBait | davanstrien | 2023-11-13T10:10:27Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T10:10:27Z | 2023-11-13T10:00:42.000Z | 2023-11-13T10:00:42 | Invalid username or password. | [
0.22538813948631287,
-0.8998719453811646,
0.4273532032966614,
0.01545056700706482,
-0.07883036881685257,
0.6044343113899231,
0.6795741319656372,
0.07246866822242737,
0.20425251126289368,
0.8107712864875793,
-0.7993434071540833,
0.2074914574623108,
-0.9463866949081421,
0.3846413493156433,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
kinit-tomassako/ver_claimdetection_demo2 | kinit-tomassako | 2023-11-22T12:51:09Z | 0 | 0 | null | [
"region:us"
] | 2023-11-22T12:51:09Z | 2023-11-13T10:00:51.000Z | 2023-11-13T10:00:51 | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | [
-0.503662645816803,
-0.5130205154418945,
0.18480272591114044,
0.20869813859462738,
-0.3474426865577698,
-0.05577763170003891,
-0.022632518783211708,
-0.6274707913398743,
0.4583321511745453,
0.8103808164596558,
-0.7633895874023438,
-0.9683905243873596,
-0.5347057580947876,
0.125262394547462... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Jafar1373/Jafar | Jafar1373 | 2023-11-13T10:16:25Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T10:16:25Z | 2023-11-13T10:16:25.000Z | 2023-11-13T10:16:25 | Entry not found | [
-0.32276487350463867,
-0.22568444907665253,
0.8622263073921204,
0.43461570143699646,
-0.5282988548278809,
0.7012969255447388,
0.7915717363357544,
0.07618642598390579,
0.7746027112007141,
0.25632190704345703,
-0.7852815389633179,
-0.22573848068714142,
-0.910447895526886,
0.5715675354003906,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
vildanh/az_alpaca_translated | vildanh | 2023-11-13T10:20:47Z | 0 | 1 | null | [
"license:mit",
"region:us"
] | 2023-11-13T10:20:47Z | 2023-11-13T10:19:52.000Z | 2023-11-13T10:19:52 | ---
license: mit
---
| [
-0.1285339742898941,
-0.18616800010204315,
0.6529127359390259,
0.4943626821041107,
-0.1931934952735901,
0.2360742688179016,
0.360720157623291,
0.05056300014257431,
0.5793654322624207,
0.7400140166282654,
-0.6508105993270874,
-0.23783984780311584,
-0.7102248668670654,
-0.047826044261455536,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
achouffe/virgule | achouffe | 2023-11-13T10:36:09Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T10:36:09Z | 2023-11-13T10:34:57.000Z | 2023-11-13T10:34:57 | Entry not found | [
-0.32276487350463867,
-0.22568444907665253,
0.8622263073921204,
0.43461570143699646,
-0.5282988548278809,
0.7012969255447388,
0.7915717363357544,
0.07618642598390579,
0.7746027112007141,
0.25632190704345703,
-0.7852815389633179,
-0.22573848068714142,
-0.910447895526886,
0.5715675354003906,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ericyu/GVLM_Cropped_256 | ericyu | 2023-11-14T15:05:40Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T15:05:40Z | 2023-11-13T11:01:52.000Z | 2023-11-13T11:01:52 | ---
dataset_info:
features:
- name: imageA
dtype: image
- name: imageB
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 112278146.48
num_examples: 4558
- name: test
num_bytes: 37388998.684
num_examples: 1519
- name: val
num_bytes: 37425501.773
num_examples: 1519
download_size: 186554180
dataset_size: 187092646.937
---
# Dataset Card for "GVLM_Cropped_256"
This is an official release of the GVLM-CD dataset. In this version, we cropped the images into patches of size 256×256.
If you use GVLM-CD in a scientific publication, we would appreciate your citing the following:
```
@article{zhang2023cross,
title={Cross-domain landslide mapping from large-scale remote sensing images using prototype-guided domain-aware progressive representation learning},
author={Zhang, Xiaokang and Yu, Weikang and Pun, Man-On and Shi, Wenzhong},
journal={ISPRS Journal of Photogrammetry and Remote Sensing},
volume={197},
pages={1--17},
year={2023},
publisher={Elsevier}
}
```
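A minimal loading sketch, assuming the features listed in the YAML above:
```python
# Hedged sketch: load the train split and inspect one change-detection triple.
from datasets import load_dataset

ds = load_dataset("ericyu/GVLM_Cropped_256", split="train")
sample = ds[0]
print(sample["imageA"].size, sample["imageB"].size, sample["label"].size)
```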
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.4681103229522705,
-0.3496367335319519,
0.29764536023139954,
0.4276117980480194,
-0.5263454914093018,
-0.06793992221355438,
0.04451766237616539,
-0.3266924023628235,
0.1141262874007225,
0.496293306350708,
-0.561720073223114,
-0.808933436870575,
-0.514896810054779,
-0.1196369007229805,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
olivermueller/winereviews | olivermueller | 2023-11-13T11:14:37Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-13T11:14:37Z | 2023-11-13T11:14:37.000Z | 2023-11-13T11:14:37 | ---
license: mit
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
orgcatorg/militarywatchmagazine | orgcatorg | 2023-11-28T04:10:35Z | 0 | 0 | null | [
"region:us"
] | 2023-11-28T04:10:35Z | 2023-11-13T11:28:32.000Z | 2023-11-13T11:28:32 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: content
dtype: string
- name: title
dtype: string
- name: source_link
dtype: string
- name: description
dtype: string
- name: date
dtype: timestamp[ns]
- name: category
sequence: string
splits:
- name: train
num_bytes: 222812
num_examples: 40
download_size: 142752
dataset_size: 222812
---
# Dataset Card for "militarywatchmagazine"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.7124247550964355,
-0.15475209057331085,
0.3247423470020294,
0.0706728845834732,
-0.1376185566186905,
0.06219887733459473,
0.39335349202156067,
0.01409047283232212,
0.7714656591415405,
0.43016836047172546,
-1.3253599405288696,
-0.760187566280365,
-0.58641117811203,
-0.38570964336395264,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
orgcatorg/army-technology | orgcatorg | 2023-11-28T04:10:44Z | 0 | 0 | null | [
"region:us"
] | 2023-11-28T04:10:44Z | 2023-11-13T11:56:54.000Z | 2023-11-13T11:56:54 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: content
dtype: string
- name: title
dtype: string
- name: source_link
dtype: string
- name: description
dtype: string
- name: date
dtype: timestamp[ns]
- name: image
dtype: string
splits:
- name: train
num_bytes: 275008
num_examples: 61
download_size: 169317
dataset_size: 275008
---
# Dataset Card for "army-technology"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.8239667415618896,
-0.4110061824321747,
0.38964906334877014,
0.10738365352153778,
-0.14867229759693146,
0.040302012115716934,
0.578410267829895,
-0.020616736263036728,
0.7105012536048889,
0.43453896045684814,
-0.9563782215118408,
-0.7281417846679688,
-0.6183910369873047,
-0.4071281552314... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_blueapple8259__TinyStories-Alpaca_public | open-llm-leaderboard | 2023-11-13T12:11:33Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T12:11:33Z | 2023-11-13T12:10:48.000Z | 2023-11-13T12:10:48 | ---
pretty_name: Evaluation run of blueapple8259/TinyStories-Alpaca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [blueapple8259/TinyStories-Alpaca](https://huggingface.co/blueapple8259/TinyStories-Alpaca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_blueapple8259__TinyStories-Alpaca_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T12:08:32.889015](https://huggingface.co/datasets/open-llm-leaderboard/details_blueapple8259__TinyStories-Alpaca_public/blob/main/results_2023-11-13T12-08-32.889015.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2343052459270292,\n\
\ \"acc_stderr\": 0.030014283954142254,\n \"acc_norm\": 0.2339194036543238,\n\
\ \"acc_norm_stderr\": 0.030804772038430715,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041834,\n \"mc2\": 0.46675301460809676,\n\
\ \"mc2_stderr\": 0.016264340534335325,\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.00036305608931191567,\n \"f1\": 0.008077810402684559,\n\
\ \"f1_stderr\": 0.000561047245736677\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20392491467576793,\n \"acc_stderr\": 0.011774262478702259,\n\
\ \"acc_norm\": 0.23976109215017063,\n \"acc_norm_stderr\": 0.012476304127453961\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25781716789484166,\n\
\ \"acc_stderr\": 0.004365388351563101,\n \"acc_norm\": 0.24915355506871142,\n\
\ \"acc_norm_stderr\": 0.004316389476434519\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.0256042334708991,\n\
\ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.0256042334708991\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.034370793441061344,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.034370793441061344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.17,\n\
\ \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \
\ \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.0285048564705142,\n\
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.0285048564705142\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.02286083830923207,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.02286083830923207\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523812,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523812\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.19032258064516128,\n\
\ \"acc_stderr\": 0.022331707611823085,\n \"acc_norm\": 0.19032258064516128,\n\
\ \"acc_norm_stderr\": 0.022331707611823085\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.03127090713297698,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.03127090713297698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.15656565656565657,\n \"acc_stderr\": 0.025890520358141454,\n \"\
acc_norm\": 0.15656565656565657,\n \"acc_norm_stderr\": 0.025890520358141454\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.15544041450777202,\n \"acc_stderr\": 0.026148483469153314,\n\
\ \"acc_norm\": 0.15544041450777202,\n \"acc_norm_stderr\": 0.026148483469153314\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2076923076923077,\n \"acc_stderr\": 0.020567539567246794,\n\
\ \"acc_norm\": 0.2076923076923077,\n \"acc_norm_stderr\": 0.020567539567246794\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22962962962962963,\n \"acc_stderr\": 0.025644108639267634,\n \
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.025644108639267634\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341923,\n\
\ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341923\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.18543046357615894,\n \"acc_stderr\": 0.03173284384294284,\n \"\
acc_norm\": 0.18543046357615894,\n \"acc_norm_stderr\": 0.03173284384294284\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.2018348623853211,\n \"acc_stderr\": 0.017208579357787572,\n \"\
acc_norm\": 0.2018348623853211,\n \"acc_norm_stderr\": 0.017208579357787572\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.125,\n \"acc_stderr\": 0.022554842722407934,\n \"acc_norm\": 0.125,\n\
\ \"acc_norm_stderr\": 0.022554842722407934\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n \
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\"\
: 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460295,\n \"\
acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460295\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.19730941704035873,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.19730941704035873,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462202,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462202\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212095,\n \"\
acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212095\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.23931623931623933,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24904214559386972,\n\
\ \"acc_stderr\": 0.015464676163395969,\n \"acc_norm\": 0.24904214559386972,\n\
\ \"acc_norm_stderr\": 0.015464676163395969\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.20261437908496732,\n \"acc_stderr\": 0.02301544687798565,\n\
\ \"acc_norm\": 0.20261437908496732,\n \"acc_norm_stderr\": 0.02301544687798565\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.20257234726688103,\n\
\ \"acc_stderr\": 0.022827317491059675,\n \"acc_norm\": 0.20257234726688103,\n\
\ \"acc_norm_stderr\": 0.022827317491059675\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.023576881744005716,\n\
\ \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.023576881744005716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.21631205673758866,\n \"acc_stderr\": 0.0245617205605628,\n \
\ \"acc_norm\": 0.21631205673758866,\n \"acc_norm_stderr\": 0.0245617205605628\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24641460234680573,\n\
\ \"acc_stderr\": 0.011005971399927234,\n \"acc_norm\": 0.24641460234680573,\n\
\ \"acc_norm_stderr\": 0.011005971399927234\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.03831305140884601,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.03831305140884601\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n\
\ \"acc_stderr\": 0.031755547866299194,\n \"acc_norm\": 0.21084337349397592,\n\
\ \"acc_norm_stderr\": 0.031755547866299194\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041834,\n \"mc2\": 0.46675301460809676,\n\
\ \"mc2_stderr\": 0.016264340534335325\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5185477505919495,\n \"acc_stderr\": 0.014042813708888378\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \
\ \"em_stderr\": 0.00036305608931191567,\n \"f1\": 0.008077810402684559,\n\
\ \"f1_stderr\": 0.000561047245736677\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/blueapple8259/TinyStories-Alpaca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|arc:challenge|25_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|drop|3_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|gsm8k|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hellaswag|10_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|winogrande|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T12-08-32.889015.parquet'
- config_name: results
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- results_2023-11-13T12-08-32.889015.parquet
- split: latest
path:
- results_2023-11-13T12-08-32.889015.parquet
---
# Dataset Card for Evaluation run of blueapple8259/TinyStories-Alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/blueapple8259/TinyStories-Alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [blueapple8259/TinyStories-Alpaca](https://huggingface.co/blueapple8259/TinyStories-Alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_blueapple8259__TinyStories-Alpaca_public",
"harness_winogrande_5",
split="train")
```
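The aggregated metrics can be loaded the same way from the "results" configuration described above (a sketch; the "latest" split points to the most recent run, as listed in the configs):
```python
from datasets import load_dataset

# Aggregated metrics for the run, as shown on the leaderboard
results = load_dataset(
    "open-llm-leaderboard/details_blueapple8259__TinyStories-Alpaca_public",
    "results",
    split="latest",
)
```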
## Latest results
These are the [latest results from run 2023-11-13T12:08:32.889015](https://huggingface.co/datasets/open-llm-leaderboard/details_blueapple8259__TinyStories-Alpaca_public/blob/main/results_2023-11-13T12-08-32.889015.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2343052459270292,
"acc_stderr": 0.030014283954142254,
"acc_norm": 0.2339194036543238,
"acc_norm_stderr": 0.030804772038430715,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041834,
"mc2": 0.46675301460809676,
"mc2_stderr": 0.016264340534335325,
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931191567,
"f1": 0.008077810402684559,
"f1_stderr": 0.000561047245736677
},
"harness|arc:challenge|25": {
"acc": 0.20392491467576793,
"acc_stderr": 0.011774262478702259,
"acc_norm": 0.23976109215017063,
"acc_norm_stderr": 0.012476304127453961
},
"harness|hellaswag|10": {
"acc": 0.25781716789484166,
"acc_stderr": 0.004365388351563101,
"acc_norm": 0.24915355506871142,
"acc_norm_stderr": 0.004316389476434519
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.0256042334708991,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.0256042334708991
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.034370793441061344,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.034370793441061344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.0285048564705142,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.0285048564705142
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.02286083830923207,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.02286083830923207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523812,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523812
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.19032258064516128,
"acc_stderr": 0.022331707611823085,
"acc_norm": 0.19032258064516128,
"acc_norm_stderr": 0.022331707611823085
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.03127090713297698,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.03127090713297698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.15656565656565657,
"acc_stderr": 0.025890520358141454,
"acc_norm": 0.15656565656565657,
"acc_norm_stderr": 0.025890520358141454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.15544041450777202,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.15544041450777202,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2076923076923077,
"acc_stderr": 0.020567539567246794,
"acc_norm": 0.2076923076923077,
"acc_norm_stderr": 0.020567539567246794
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.025644108639267634,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.025644108639267634
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.028510251512341923,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.028510251512341923
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.18543046357615894,
"acc_stderr": 0.03173284384294284,
"acc_norm": 0.18543046357615894,
"acc_norm_stderr": 0.03173284384294284
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2018348623853211,
"acc_stderr": 0.017208579357787572,
"acc_norm": 0.2018348623853211,
"acc_norm_stderr": 0.017208579357787572
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.125,
"acc_stderr": 0.022554842722407934,
"acc_norm": 0.125,
"acc_norm_stderr": 0.022554842722407934
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460295,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460295
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.19730941704035873,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.19730941704035873,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462202,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462202
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.04294340845212095,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.04294340845212095
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23931623931623933,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.23931623931623933,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24904214559386972,
"acc_stderr": 0.015464676163395969,
"acc_norm": 0.24904214559386972,
"acc_norm_stderr": 0.015464676163395969
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.20261437908496732,
"acc_stderr": 0.02301544687798565,
"acc_norm": 0.20261437908496732,
"acc_norm_stderr": 0.02301544687798565
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.20257234726688103,
"acc_stderr": 0.022827317491059675,
"acc_norm": 0.20257234726688103,
"acc_norm_stderr": 0.022827317491059675
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2345679012345679,
"acc_stderr": 0.023576881744005716,
"acc_norm": 0.2345679012345679,
"acc_norm_stderr": 0.023576881744005716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.21631205673758866,
"acc_stderr": 0.0245617205605628,
"acc_norm": 0.21631205673758866,
"acc_norm_stderr": 0.0245617205605628
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24641460234680573,
"acc_stderr": 0.011005971399927234,
"acc_norm": 0.24641460234680573,
"acc_norm_stderr": 0.011005971399927234
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884601,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884601
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21890547263681592,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.21890547263681592,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.031755547866299194,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.031755547866299194
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041834,
"mc2": 0.46675301460809676,
"mc2_stderr": 0.016264340534335325
},
"harness|winogrande|5": {
"acc": 0.5185477505919495,
"acc_stderr": 0.014042813708888378
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931191567,
"f1": 0.008077810402684559,
"f1_stderr": 0.000561047245736677
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
Huzreyn/Huzreyn | Huzreyn | 2023-11-13T12:12:26Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-13T12:12:26Z | 2023-11-13T12:11:32.000Z | 2023-11-13T12:11:32 | ---
license: apache-2.0
---
richif/models | richif | 2023-11-13T13:13:05Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T13:13:05Z | 2023-11-13T12:26:59.000Z | 2023-11-13T12:26:59 | Entry not found
orgcatorg/bulgarianmilitary | orgcatorg | 2023-11-28T04:10:53Z | 0 | 0 | null | [
"region:us"
] | 2023-11-28T04:10:53Z | 2023-11-13T12:55:38.000Z | 2023-11-13T12:55:38 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: content
dtype: string
- name: title
dtype: string
- name: source_link
dtype: string
- name: description
dtype: string
- name: date
dtype: timestamp[ns]
- name: image
dtype: string
splits:
- name: train
num_bytes: 328831
num_examples: 58
download_size: 200761
dataset_size: 328831
---
# Dataset Card for "bulgarianmilitary"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_ethzanalytics__pythia-31m_public | open-llm-leaderboard | 2023-11-13T13:04:35Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T13:04:35Z | 2023-11-13T13:03:49.000Z | 2023-11-13T13:03:49 | ---
pretty_name: Evaluation run of ethzanalytics/pythia-31m
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ethzanalytics/pythia-31m](https://huggingface.co/ethzanalytics/pythia-31m) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ethzanalytics__pythia-31m_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T13:01:31.225551](https://huggingface.co/datasets/open-llm-leaderboard/details_ethzanalytics__pythia-31m_public/blob/main/results_2023-11-13T13-01-31.225551.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2486090533214635,\n\
\ \"acc_stderr\": 0.030580280893238346,\n \"acc_norm\": 0.24951095231696532,\n\
\ \"acc_norm_stderr\": 0.031375786973211,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.49102256781530107,\n\
\ \"mc2_stderr\": 0.015750842651440947,\n \"em\": 0.0006291946308724832,\n\
\ \"em_stderr\": 0.0002568002749723811,\n \"f1\": 0.013650377516778552,\n\
\ \"f1_stderr\": 0.0006539918270891778\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.1697952218430034,\n \"acc_stderr\": 0.010971775157784212,\n\
\ \"acc_norm\": 0.21843003412969283,\n \"acc_norm_stderr\": 0.012074291605700985\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26309500099581756,\n\
\ \"acc_stderr\": 0.004394136724172986,\n \"acc_norm\": 0.26996614220274845,\n\
\ \"acc_norm_stderr\": 0.00443034623465038\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066655,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066655\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827842,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827842\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.03126511206173044,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.03126511206173044\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.029513196625539355,\n\
\ \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.029513196625539355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.0339549002085611,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.0339549002085611\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.24193548387096775,\n \"acc_stderr\": 0.024362599693031103,\n \"\
acc_norm\": 0.24193548387096775,\n \"acc_norm_stderr\": 0.024362599693031103\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.0317852971064275,\n \"acc_norm\"\
: 0.2857142857142857,\n \"acc_norm_stderr\": 0.0317852971064275\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.031584153240477086,\n\
\ \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.031584153240477086\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.30303030303030304,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178263,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178263\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2230769230769231,\n \"acc_stderr\": 0.02110773012724398,\n \
\ \"acc_norm\": 0.2230769230769231,\n \"acc_norm_stderr\": 0.02110773012724398\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36554621848739494,\n \"acc_stderr\": 0.03128217706368461,\n\
\ \"acc_norm\": 0.36554621848739494,\n \"acc_norm_stderr\": 0.03128217706368461\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22935779816513763,\n \"acc_stderr\": 0.018025349724618684,\n \"\
acc_norm\": 0.22935779816513763,\n \"acc_norm_stderr\": 0.018025349724618684\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647206,\n \"\
acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647206\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.29596412556053814,\n\
\ \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.29596412556053814,\n\
\ \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21794871794871795,\n\
\ \"acc_stderr\": 0.027046857630716688,\n \"acc_norm\": 0.21794871794871795,\n\
\ \"acc_norm_stderr\": 0.027046857630716688\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24265644955300128,\n\
\ \"acc_stderr\": 0.015329888940899879,\n \"acc_norm\": 0.24265644955300128,\n\
\ \"acc_norm_stderr\": 0.015329888940899879\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.02279711027807113,\n\
\ \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.02279711027807113\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.01421957078810399,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.01421957078810399\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.02355083135199509,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.02355083135199509\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.18971061093247588,\n\
\ \"acc_stderr\": 0.022268196258783228,\n \"acc_norm\": 0.18971061093247588,\n\
\ \"acc_norm_stderr\": 0.022268196258783228\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.023468429832451156,\n\
\ \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.023468429832451156\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2127659574468085,\n \"acc_stderr\": 0.024414612974307696,\n \
\ \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.024414612974307696\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24445893089960888,\n\
\ \"acc_stderr\": 0.010976425013113907,\n \"acc_norm\": 0.24445893089960888,\n\
\ \"acc_norm_stderr\": 0.010976425013113907\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.029722152099280058,\n\
\ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.029722152099280058\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322253,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322253\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.03831305140884601,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.03831305140884601\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3306122448979592,\n \"acc_stderr\": 0.030116426296540596,\n\
\ \"acc_norm\": 0.3306122448979592,\n \"acc_norm_stderr\": 0.030116426296540596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348377,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348377\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.22289156626506024,\n\
\ \"acc_stderr\": 0.03240004825594687,\n \"acc_norm\": 0.22289156626506024,\n\
\ \"acc_norm_stderr\": 0.03240004825594687\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23976608187134502,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.23976608187134502,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.49102256781530107,\n\
\ \"mc2_stderr\": 0.015750842651440947\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.4972375690607735,\n \"acc_stderr\": 0.014052271211616445\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0006291946308724832,\n \
\ \"em_stderr\": 0.0002568002749723811,\n \"f1\": 0.013650377516778552,\n\
\ \"f1_stderr\": 0.0006539918270891778\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.002274450341167551,\n \"acc_stderr\": 0.0013121578148674168\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ethzanalytics/pythia-31m
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|arc:challenge|25_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|drop|3_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|gsm8k|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hellaswag|10_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-01-31.225551.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T13-01-31.225551.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- '**/details_harness|winogrande|5_2023-11-13T13-01-31.225551.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T13-01-31.225551.parquet'
- config_name: results
data_files:
- split: 2023_11_13T13_01_31.225551
path:
- results_2023-11-13T13-01-31.225551.parquet
- split: latest
path:
- results_2023-11-13T13-01-31.225551.parquet
---
# Dataset Card for Evaluation run of ethzanalytics/pythia-31m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ethzanalytics/pythia-31m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ethzanalytics/pythia-31m](https://huggingface.co/ethzanalytics/pythia-31m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ethzanalytics__pythia-31m_public",
"harness_winogrande_5",
split="train")
```
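The aggregated metrics live in the "results" configuration described above. As a minimal sketch (the config and split names below are taken from this card; inspecting the first row is purely illustrative), they can be loaded the same way:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; the "latest"
# split always points to the most recent evaluation recorded here.
results = load_dataset(
    "open-llm-leaderboard/details_ethzanalytics__pythia-31m_public",
    "results",
    split="latest",
)

print(results[0])  # one row of aggregated scores (illustrative)
```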
## Latest results
These are the [latest results from run 2023-11-13T13:01:31.225551](https://huggingface.co/datasets/open-llm-leaderboard/details_ethzanalytics__pythia-31m_public/blob/main/results_2023-11-13T13-01-31.225551.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2486090533214635,
"acc_stderr": 0.030580280893238346,
"acc_norm": 0.24951095231696532,
"acc_norm_stderr": 0.031375786973211,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015023,
"mc2": 0.49102256781530107,
"mc2_stderr": 0.015750842651440947,
"em": 0.0006291946308724832,
"em_stderr": 0.0002568002749723811,
"f1": 0.013650377516778552,
"f1_stderr": 0.0006539918270891778
},
"harness|arc:challenge|25": {
"acc": 0.1697952218430034,
"acc_stderr": 0.010971775157784212,
"acc_norm": 0.21843003412969283,
"acc_norm_stderr": 0.012074291605700985
},
"harness|hellaswag|10": {
"acc": 0.26309500099581756,
"acc_stderr": 0.004394136724172986,
"acc_norm": 0.26996614220274845,
"acc_norm_stderr": 0.00443034623465038
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066655,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066655
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827842,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827842
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.03126511206173044,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.03126511206173044
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.029513196625539355,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.029513196625539355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.0339549002085611,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.0339549002085611
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24193548387096775,
"acc_stderr": 0.024362599693031103,
"acc_norm": 0.24193548387096775,
"acc_norm_stderr": 0.024362599693031103
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0317852971064275,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0317852971064275
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.20606060606060606,
"acc_stderr": 0.031584153240477086,
"acc_norm": 0.20606060606060606,
"acc_norm_stderr": 0.031584153240477086
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178263,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178263
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2230769230769231,
"acc_stderr": 0.02110773012724398,
"acc_norm": 0.2230769230769231,
"acc_norm_stderr": 0.02110773012724398
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275794,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275794
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36554621848739494,
"acc_stderr": 0.03128217706368461,
"acc_norm": 0.36554621848739494,
"acc_norm_stderr": 0.03128217706368461
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22935779816513763,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.22935779816513763,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647206,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647206
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604243,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604243
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.29596412556053814,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.29596412556053814,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.027046857630716688,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.027046857630716688
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24265644955300128,
"acc_stderr": 0.015329888940899879,
"acc_norm": 0.24265644955300128,
"acc_norm_stderr": 0.015329888940899879
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.02279711027807113,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.02279711027807113
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.01421957078810399,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.01421957078810399
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.18971061093247588,
"acc_stderr": 0.022268196258783228,
"acc_norm": 0.18971061093247588,
"acc_norm_stderr": 0.022268196258783228
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.023468429832451156,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.023468429832451156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.024414612974307696,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.024414612974307696
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24445893089960888,
"acc_stderr": 0.010976425013113907,
"acc_norm": 0.24445893089960888,
"acc_norm_stderr": 0.010976425013113907
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.029722152099280058,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.029722152099280058
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322253,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322253
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884601,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884601
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3306122448979592,
"acc_stderr": 0.030116426296540596,
"acc_norm": 0.3306122448979592,
"acc_norm_stderr": 0.030116426296540596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348377,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348377
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-virology|5": {
"acc": 0.22289156626506024,
"acc_stderr": 0.03240004825594687,
"acc_norm": 0.22289156626506024,
"acc_norm_stderr": 0.03240004825594687
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23976608187134502,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.23976608187134502,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015023,
"mc2": 0.49102256781530107,
"mc2_stderr": 0.015750842651440947
},
"harness|winogrande|5": {
"acc": 0.4972375690607735,
"acc_stderr": 0.014052271211616445
},
"harness|drop|3": {
"em": 0.0006291946308724832,
"em_stderr": 0.0002568002749723811,
"f1": 0.013650377516778552,
"f1_stderr": 0.0006539918270891778
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674168
}
}
```
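
For quick inspection without loading a full configuration, the raw results file linked above can be fetched directly. A sketch, assuming the file keeps the layout printed here (the metrics possibly sitting under a top-level `results` key is an assumption):

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON referenced above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_ethzanalytics__pythia-31m_public",
    filename="results_2023-11-13T13-01-31.225551.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Fall back to the top level if the metrics are not nested under "results".
metrics = data.get("results", data)
print(metrics["all"]["acc"], metrics["harness|arc:challenge|25"]["acc_norm"])
```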
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
(embedding values truncated) ] | null | null | null | null | null | null | null | null | null | null | null | null | null |
Definite/processed_bert_dataset | Definite | 2023-11-13T13:15:04Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T13:15:04Z | 2023-11-13T13:15:03.000Z | 2023-11-13T13:15:03 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 3600000.0
num_examples: 1000
download_size: 1132443
dataset_size: 3600000.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "processed_bert_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
(embedding values truncated) ] | null | null | null | null | null | null | null | null | null | null | null | null | null |
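
The features in the `processed_bert_dataset` card above match standard fast-tokenizer output. A hypothetical sketch of how such a dataset is typically produced; the checkpoint and text column are assumptions, not taken from the card:

```python
from datasets import Dataset
from transformers import AutoTokenizer

# Hypothetical reproduction: tokenize raw text into the four features listed above.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # assumed checkpoint

raw = Dataset.from_dict({"text": ["an example sentence", "another one"]})
processed = raw.map(
    lambda batch: tokenizer(
        batch["text"],
        truncation=True,
        padding="max_length",
        return_special_tokens_mask=True,  # yields the special_tokens_mask column
    ),
    batched=True,
    remove_columns=["text"],
)
print(processed.features)  # input_ids, token_type_ids, attention_mask, special_tokens_mask
```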
yxchng/laion_synthetic_filtered_large | yxchng | 2023-11-13T13:16:36Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T13:16:36Z | 2023-11-13T13:16:36.000Z | 2023-11-13T13:16:36 | Entry not found | [
(embedding values truncated) ] | null | null | null | null | null | null | null | null | null | null | null | null | null |
ali444/MIS | ali444 | 2023-11-13T13:19:16Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T13:19:16Z | 2023-11-13T13:19:15.000Z | 2023-11-13T13:19:15 | ---
dataset_info:
features:
- name: passing_semester
dtype: string
- name: college_name
dtype: string
- name: prog_name
dtype: string
- name: cms_no
dtype: int64
- name: reg_no
dtype: string
- name: name
dtype: string
- name: guardian_name
dtype: string
- name: Intake_semester
dtype: string
- name: earned_cr_hrs
dtype: float64
- name: inter_max_12
dtype: float64
- name: inter_obt_12
dtype: float64
- name: inter_max_11
dtype: float64
- name: inter_obt_11
dtype: float64
splits:
- name: train
num_bytes: 625269
num_examples: 3126
download_size: 147966
dataset_size: 625269
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "MIS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
(embedding values truncated) ] | null | null | null | null | null | null | null | null | null | null | null | null | null |
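
The `MIS` schema above is a flat table of student records. A short sketch of loading it and inspecting the columns, assuming the repo is publicly readable:

```python
from datasets import load_dataset

# Load the student-records table described in the MIS card above.
ds = load_dataset("ali444/MIS", split="train")
print(ds.features)  # the thirteen columns listed in the card
print(ds[0]["reg_no"], ds[0]["earned_cr_hrs"])
```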
open-llm-leaderboard/details_postbot__distilgpt2-emailgen_public | open-llm-leaderboard | 2023-11-13T13:27:40Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T13:27:40Z | 2023-11-13T13:26:53.000Z | 2023-11-13T13:26:53 | ---
pretty_name: Evaluation run of postbot/distilgpt2-emailgen
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [postbot/distilgpt2-emailgen](https://huggingface.co/postbot/distilgpt2-emailgen)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_postbot__distilgpt2-emailgen_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T13:25:05.974225](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__distilgpt2-emailgen_public/blob/main/results_2023-11-13T13-25-05.974225.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2585985031430374,\n\
\ \"acc_stderr\": 0.03091312867789808,\n \"acc_norm\": 0.2592605342225761,\n\
\ \"acc_norm_stderr\": 0.03173517189546408,\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.46170278335459186,\n\
\ \"mc2_stderr\": 0.01541047587026832,\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 0.011639052013422831,\n \"f1_stderr\"\
: 0.0006056902097790024\n },\n \"harness|arc:challenge|25\": {\n \"\
acc\": 0.18600682593856654,\n \"acc_stderr\": 0.01137094018326675,\n \
\ \"acc_norm\": 0.2175767918088737,\n \"acc_norm_stderr\": 0.012057262020972497\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2687711611232822,\n\
\ \"acc_stderr\": 0.004424146562746121,\n \"acc_norm\": 0.27524397530372435,\n\
\ \"acc_norm_stderr\": 0.004457243336616497\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677088,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677088\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.034370793441061344,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.034370793441061344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.044084400227680814,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.044084400227680814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.225531914893617,\n \"acc_stderr\": 0.027321078417387533,\n\
\ \"acc_norm\": 0.225531914893617,\n \"acc_norm_stderr\": 0.027321078417387533\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438014,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438014\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n\
\ \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365904,\n \"\
acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365904\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23834196891191708,\n \"acc_stderr\": 0.030748905363909902,\n\
\ \"acc_norm\": 0.23834196891191708,\n \"acc_norm_stderr\": 0.030748905363909902\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.021193632525148543,\n\
\ \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.021193632525148543\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.031041941304059288,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.031041941304059288\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21651376146788992,\n \"acc_stderr\": 0.017658710594443128,\n \"\
acc_norm\": 0.21651376146788992,\n \"acc_norm_stderr\": 0.017658710594443128\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n\
\ \"acc_stderr\": 0.03077855467869326,\n \"acc_norm\": 0.25980392156862747,\n\
\ \"acc_norm_stderr\": 0.03077855467869326\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035307,\n\
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035307\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2600896860986547,\n\
\ \"acc_stderr\": 0.02944249558585746,\n \"acc_norm\": 0.2600896860986547,\n\
\ \"acc_norm_stderr\": 0.02944249558585746\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.042032772914677614,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.042032772914677614\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18803418803418803,\n\
\ \"acc_stderr\": 0.02559819368665226,\n \"acc_norm\": 0.18803418803418803,\n\
\ \"acc_norm_stderr\": 0.02559819368665226\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2554278416347382,\n\
\ \"acc_stderr\": 0.015594955384455766,\n \"acc_norm\": 0.2554278416347382,\n\
\ \"acc_norm_stderr\": 0.015594955384455766\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n\
\ \"acc_stderr\": 0.025755865922632924,\n \"acc_norm\": 0.28938906752411575,\n\
\ \"acc_norm_stderr\": 0.025755865922632924\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.025171041915309684,\n\
\ \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.025171041915309684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.19858156028368795,\n \"acc_stderr\": 0.023798301637942106,\n \
\ \"acc_norm\": 0.19858156028368795,\n \"acc_norm_stderr\": 0.023798301637942106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24837027379400262,\n\
\ \"acc_stderr\": 0.011035212598034501,\n \"acc_norm\": 0.24837027379400262,\n\
\ \"acc_norm_stderr\": 0.011035212598034501\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782834,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2653061224489796,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.2653061224489796,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n\
\ \"acc_stderr\": 0.031069390260789424,\n \"acc_norm\": 0.19879518072289157,\n\
\ \"acc_norm_stderr\": 0.031069390260789424\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.46170278335459186,\n\
\ \"mc2_stderr\": 0.01541047587026832\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.516179952644041,\n \"acc_stderr\": 0.014045126130978603\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\"\
: 0.0,\n \"f1\": 0.011639052013422831,\n \"f1_stderr\": 0.0006056902097790024\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/postbot/distilgpt2-emailgen
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|arc:challenge|25_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|drop|3_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|gsm8k|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hellaswag|10_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-25-05.974225.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T13-25-05.974225.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- '**/details_harness|winogrande|5_2023-11-13T13-25-05.974225.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T13-25-05.974225.parquet'
- config_name: results
data_files:
- split: 2023_11_13T13_25_05.974225
path:
- results_2023-11-13T13-25-05.974225.parquet
- split: latest
path:
- results_2023-11-13T13-25-05.974225.parquet
---
# Dataset Card for Evaluation run of postbot/distilgpt2-emailgen
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/postbot/distilgpt2-emailgen
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [postbot/distilgpt2-emailgen](https://huggingface.co/postbot/distilgpt2-emailgen) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
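Since one configuration is generated per task and one split per run, it can help to enumerate them programmatically before loading anything. A minimal sketch using the `datasets` helper functions (the repo and config names are taken from this card; nothing else is assumed):
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_postbot__distilgpt2-emailgen_public"

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs))

# One timestamped split per run, plus "latest".
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```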
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_postbot__distilgpt2-emailgen_public",
"harness_winogrande_5",
	split="latest")
```
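The aggregated metrics live in the separate `results` configuration declared in the YAML header above. A minimal sketch for loading it (the exact column layout of that configuration is not documented on this card, so inspect the first row):
```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_postbot__distilgpt2-emailgen_public",
    "results",
    split="latest",  # or the timestamped split "2023_11_13T13_25_05.974225"
)
print(results[0])
```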
## Latest results
These are the [latest results from run 2023-11-13T13:25:05.974225](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__distilgpt2-emailgen_public/blob/main/results_2023-11-13T13-25-05.974225.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2585985031430374,
"acc_stderr": 0.03091312867789808,
"acc_norm": 0.2592605342225761,
"acc_norm_stderr": 0.03173517189546408,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.46170278335459186,
"mc2_stderr": 0.01541047587026832,
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.011639052013422831,
"f1_stderr": 0.0006056902097790024
},
"harness|arc:challenge|25": {
"acc": 0.18600682593856654,
"acc_stderr": 0.01137094018326675,
"acc_norm": 0.2175767918088737,
"acc_norm_stderr": 0.012057262020972497
},
"harness|hellaswag|10": {
"acc": 0.2687711611232822,
"acc_stderr": 0.004424146562746121,
"acc_norm": 0.27524397530372435,
"acc_norm_stderr": 0.004457243336616497
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.034370793441061344,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.034370793441061344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680814,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.225531914893617,
"acc_stderr": 0.027321078417387533,
"acc_norm": 0.225531914893617,
"acc_norm_stderr": 0.027321078417387533
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843671,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843671
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438014,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438014
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365904,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365904
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23834196891191708,
"acc_stderr": 0.030748905363909902,
"acc_norm": 0.23834196891191708,
"acc_norm_stderr": 0.030748905363909902
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.22564102564102564,
"acc_stderr": 0.021193632525148543,
"acc_norm": 0.22564102564102564,
"acc_norm_stderr": 0.021193632525148543
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.031041941304059288,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.031041941304059288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21651376146788992,
"acc_stderr": 0.017658710594443128,
"acc_norm": 0.21651376146788992,
"acc_norm_stderr": 0.017658710594443128
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.028304657943035307,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.028304657943035307
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2600896860986547,
"acc_stderr": 0.02944249558585746,
"acc_norm": 0.2600896860986547,
"acc_norm_stderr": 0.02944249558585746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.042032772914677614,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.042032772914677614
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18803418803418803,
"acc_stderr": 0.02559819368665226,
"acc_norm": 0.18803418803418803,
"acc_norm_stderr": 0.02559819368665226
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2554278416347382,
"acc_stderr": 0.015594955384455766,
"acc_norm": 0.2554278416347382,
"acc_norm_stderr": 0.015594955384455766
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.025755865922632924,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.025755865922632924
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.025171041915309684,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.025171041915309684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.19858156028368795,
"acc_stderr": 0.023798301637942106,
"acc_norm": 0.19858156028368795,
"acc_norm_stderr": 0.023798301637942106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24837027379400262,
"acc_stderr": 0.011035212598034501,
"acc_norm": 0.24837027379400262,
"acc_norm_stderr": 0.011035212598034501
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782834,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2653061224489796,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.2653061224489796,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.031069390260789424,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.031069390260789424
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.46170278335459186,
"mc2_stderr": 0.01541047587026832
},
"harness|winogrande|5": {
"acc": 0.516179952644041,
"acc_stderr": 0.014045126130978603
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.011639052013422831,
"f1_stderr": 0.0006056902097790024
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
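Because the per-task keys follow a uniform `harness|<suite>-<task>|<n_shot>` naming scheme, suite-level aggregates such as the MMLU (hendrycksTest) average can be recomputed directly from a dict shaped like the one above. A small illustrative sketch; `mmlu_average` is a hypothetical helper, and the two sample entries are copied from the results above:
```python
import statistics

def mmlu_average(results: dict) -> float:
    """Mean accuracy over all hendrycksTest (MMLU) subtask entries."""
    accs = [
        v["acc"]
        for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")
    ]
    return statistics.mean(accs)

# Two entries copied from the results above, as a stand-in for the full dict.
sample = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.32592592592592595},
}
print(mmlu_average(sample))  # ~0.308 on this two-task subset
```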
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7523317337036133,
-0.8725412487983704,
0.27960360050201416,
0.22490282356739044,
-0.18606065213680267,
-0.043637365102767944,
0.0053742071613669395,
-0.21240171790122986,
0.5405378341674805,
-0.101746566593647,
-0.49575069546699524,
-0.6796361207962036,
-0.5248180031776428,
0.2428017705... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_postbot__distilgpt2-emailgen-V2_public | open-llm-leaderboard | 2023-11-13T13:31:25Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T13:31:25Z | 2023-11-13T13:30:39.000Z | 2023-11-13T13:30:39 | ---
pretty_name: Evaluation run of postbot/distilgpt2-emailgen-V2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [postbot/distilgpt2-emailgen-V2](https://huggingface.co/postbot/distilgpt2-emailgen-V2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_postbot__distilgpt2-emailgen-V2_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T13:28:50.616028](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__distilgpt2-emailgen-V2_public/blob/main/results_2023-11-13T13-28-50.616028.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2542066525769912,\n\
\ \"acc_stderr\": 0.030683618404772357,\n \"acc_norm\": 0.2547326716552163,\n\
\ \"acc_norm_stderr\": 0.031502030622377816,\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.015572840452875828,\n \"mc2\": 0.4651319733972654,\n\
\ \"mc2_stderr\": 0.016103347289806055,\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 0.003143875838926175,\n \"f1_stderr\"\
: 0.00031171556932365637\n },\n \"harness|arc:challenge|25\": {\n \"\
acc\": 0.1689419795221843,\n \"acc_stderr\": 0.01094979565248503,\n \
\ \"acc_norm\": 0.2098976109215017,\n \"acc_norm_stderr\": 0.011900548748047442\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26598287193786097,\n\
\ \"acc_stderr\": 0.004409521343140109,\n \"acc_norm\": 0.26777534355706034,\n\
\ \"acc_norm_stderr\": 0.004418948941099411\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.03016753346863271,\n\
\ \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.03016753346863271\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.025757559893106744,\n\
\ \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.025757559893106744\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.0321473730202947,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.0321473730202947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.15,\n \"acc_stderr\": 0.035887028128263714,\n \"acc_norm\": 0.15,\n\
\ \"acc_norm_stderr\": 0.035887028128263714\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.23829787234042554,\n \"acc_stderr\": 0.02785125297388979,\n\
\ \"acc_norm\": 0.23829787234042554,\n \"acc_norm_stderr\": 0.02785125297388979\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\
\ \"acc_stderr\": 0.03268454013011743,\n \"acc_norm\": 0.15873015873015872,\n\
\ \"acc_norm_stderr\": 0.03268454013011743\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.03158415324047707,\n\
\ \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.03158415324047707\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178256,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178256\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671548,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473835,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473835\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21467889908256882,\n \"acc_stderr\": 0.01760430414925648,\n \"\
acc_norm\": 0.21467889908256882,\n \"acc_norm_stderr\": 0.01760430414925648\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2062780269058296,\n\
\ \"acc_stderr\": 0.027157150479563824,\n \"acc_norm\": 0.2062780269058296,\n\
\ \"acc_norm_stderr\": 0.027157150479563824\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20512820512820512,\n\
\ \"acc_stderr\": 0.02645350805404035,\n \"acc_norm\": 0.20512820512820512,\n\
\ \"acc_norm_stderr\": 0.02645350805404035\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2656449553001277,\n\
\ \"acc_stderr\": 0.015794302487888726,\n \"acc_norm\": 0.2656449553001277,\n\
\ \"acc_norm_stderr\": 0.015794302487888726\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.02207570925175717,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.02207570925175717\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.024404394928087873,\n\
\ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.024404394928087873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02540719779889016,\n\
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02540719779889016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432407,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714854,\n\
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714854\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26633986928104575,\n \"acc_stderr\": 0.017883188134667178,\n \
\ \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.017883188134667178\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2163265306122449,\n \"acc_stderr\": 0.026358916334904035,\n\
\ \"acc_norm\": 0.2163265306122449,\n \"acc_norm_stderr\": 0.026358916334904035\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\
\ \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n\
\ \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.015572840452875828,\n \"mc2\": 0.4651319733972654,\n\
\ \"mc2_stderr\": 0.016103347289806055\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5201262825572218,\n \"acc_stderr\": 0.01404109666434433\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\"\
: 0.0,\n \"f1\": 0.003143875838926175,\n \"f1_stderr\": 0.00031171556932365637\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/postbot/distilgpt2-emailgen-V2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|arc:challenge|25_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|drop|3_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|gsm8k|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hellaswag|10_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|winogrande|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T13-28-50.616028.parquet'
- config_name: results
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- results_2023-11-13T13-28-50.616028.parquet
- split: latest
path:
- results_2023-11-13T13-28-50.616028.parquet
---
# Dataset Card for Evaluation run of postbot/distilgpt2-emailgen-V2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/postbot/distilgpt2-emailgen-V2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [postbot/distilgpt2-emailgen-V2](https://huggingface.co/postbot/distilgpt2-emailgen-V2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_postbot__distilgpt2-emailgen-V2_public",
"harness_winogrande_5",
split="train")
```
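To load the aggregated metrics instead of the per-example details, a minimal sketch along the same lines (the repo declares a "results" configuration with the same "latest" split convention):
```python
from datasets import load_dataset

# Aggregated run metrics; the "latest" split always points to the most recent results file.
results = load_dataset("open-llm-leaderboard/details_postbot__distilgpt2-emailgen-V2_public",
                       "results",
                       split="latest")
```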
## Latest results
These are the [latest results from run 2023-11-13T13:28:50.616028](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__distilgpt2-emailgen-V2_public/blob/main/results_2023-11-13T13-28-50.616028.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2542066525769912,
"acc_stderr": 0.030683618404772357,
"acc_norm": 0.2547326716552163,
"acc_norm_stderr": 0.031502030622377816,
"mc1": 0.2717258261933905,
"mc1_stderr": 0.015572840452875828,
"mc2": 0.4651319733972654,
"mc2_stderr": 0.016103347289806055,
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.003143875838926175,
"f1_stderr": 0.00031171556932365637
},
"harness|arc:challenge|25": {
"acc": 0.1689419795221843,
"acc_stderr": 0.01094979565248503,
"acc_norm": 0.2098976109215017,
"acc_norm_stderr": 0.011900548748047442
},
"harness|hellaswag|10": {
"acc": 0.26598287193786097,
"acc_stderr": 0.004409521343140109,
"acc_norm": 0.26777534355706034,
"acc_norm_stderr": 0.004418948941099411
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.03016753346863271,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.03016753346863271
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22641509433962265,
"acc_stderr": 0.025757559893106744,
"acc_norm": 0.22641509433962265,
"acc_norm_stderr": 0.025757559893106744
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.0321473730202947,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.0321473730202947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.15,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.15,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23829787234042554,
"acc_stderr": 0.02785125297388979,
"acc_norm": 0.23829787234042554,
"acc_norm_stderr": 0.02785125297388979
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15873015873015872,
"acc_stderr": 0.03268454013011743,
"acc_norm": 0.15873015873015872,
"acc_norm_stderr": 0.03268454013011743
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.20606060606060606,
"acc_stderr": 0.03158415324047707,
"acc_norm": 0.20606060606060606,
"acc_norm_stderr": 0.03158415324047707
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178256,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178256
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128013,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128013
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02665353159671548,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02665353159671548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473835,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473835
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21467889908256882,
"acc_stderr": 0.01760430414925648,
"acc_norm": 0.21467889908256882,
"acc_norm_stderr": 0.01760430414925648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2062780269058296,
"acc_stderr": 0.027157150479563824,
"acc_norm": 0.2062780269058296,
"acc_norm_stderr": 0.027157150479563824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.02645350805404035,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.02645350805404035
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2656449553001277,
"acc_stderr": 0.015794302487888726,
"acc_norm": 0.2656449553001277,
"acc_norm_stderr": 0.015794302487888726
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.02207570925175717,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.02207570925175717
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.024404394928087873,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.024404394928087873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432407,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.030042615832714854,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.030042615832714854
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.017883188134667178,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.017883188134667178
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2163265306122449,
"acc_stderr": 0.026358916334904035,
"acc_norm": 0.2163265306122449,
"acc_norm_stderr": 0.026358916334904035
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.035509201856896294,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.035509201856896294
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2717258261933905,
"mc1_stderr": 0.015572840452875828,
"mc2": 0.4651319733972654,
"mc2_stderr": 0.016103347289806055
},
"harness|winogrande|5": {
"acc": 0.5201262825572218,
"acc_stderr": 0.01404109666434433
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.003143875838926175,
"f1_stderr": 0.00031171556932365637
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.737996518611908,
-0.8424129486083984,
0.29306820034980774,
0.22721686959266663,
-0.19489075243473053,
-0.029187137261033058,
0.002133350120857358,
-0.21588793396949768,
0.5304617881774902,
-0.09917899966239929,
-0.5075585246086121,
-0.6768341660499573,
-0.5385795831680298,
0.24680307507... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
nlplabtdtu/OpenOrca-solution-for-a-goal-vi | nlplabtdtu | 2023-11-13T13:34:18Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T13:34:18Z | 2023-11-13T13:33:50.000Z | 2023-11-13T13:33:50 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
nlplabtdtu/OpenOrca-2-fact-vi | nlplabtdtu | 2023-11-13T13:35:07Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T13:35:07Z | 2023-11-13T13:34:56.000Z | 2023-11-13T13:34:56 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
nlplabtdtu/OpenOrca-conclusion-condition-vi | nlplabtdtu | 2023-11-13T13:36:10Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T13:36:10Z | 2023-11-13T13:35:56.000Z | 2023-11-13T13:35:56 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
SinKove/synthetic_liver_tumor_CT | SinKove | 2023-11-13T13:36:25Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-13T13:36:25Z | 2023-11-13T13:36:25.000Z | 2023-11-13T13:36:25 | ---
license: openrail
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
PrevenIA/silver_standard_preguntas_suicidio_dataset | PrevenIA | 2023-11-13T13:51:10Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T13:51:10Z | 2023-11-13T13:46:51.000Z | 2023-11-13T13:46:51 | Entry not found | [
-0.3227647542953491,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965083122253,
0.7915717959403992,
0.07618629932403564,
0.7746022343635559,
0.2563222348690033,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
lwface/sd-configs-1.5 | lwface | 2023-11-13T16:14:11Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-13T16:14:11Z | 2023-11-13T14:25:07.000Z | 2023-11-13T14:25:07 | ---
license: mit
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
tollefj/stsbenchmark-sts-NOB | tollefj | 2023-11-13T14:30:53Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T14:30:53Z | 2023-11-13T14:30:48.000Z | 2023-11-13T14:30:48 | # Translated STS dataset to Norwegian Bokmål
Machine translated using the *No Language Left Behind* (NLLB) model series, specifically the distilled 1.3B variant: https://huggingface.co/facebook/nllb-200-distilled-1.3B
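For reference, a minimal sketch of producing such a translation with this checkpoint (assuming the standard `transformers` NLLB interface; the exact pipeline used for this dataset is not documented):
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "facebook/nllb-200-distilled-1.3B"
tokenizer = AutoTokenizer.from_pretrained(model_name, src_lang="eng_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Translate one English STS sentence to Norwegian Bokmål ("nob_Latn" in the
# NLLB/FLORES-200 language codes) by forcing the target-language token.
inputs = tokenizer("A man is playing a guitar.", return_tensors="pt")
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("nob_Latn"),
    max_length=64,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```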
| [
-0.1628946214914322,
-0.5689592957496643,
0.43053141236305237,
0.5253563523292542,
-0.8773573040962219,
0.3225034177303314,
0.2025361955165863,
-0.2466166615486145,
0.3589381277561188,
0.8461148738861084,
-0.7320520281791687,
-0.7031416893005371,
-0.7817871570587158,
0.17299973964691162,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
aqeel1215/e-commerce-products | aqeel1215 | 2023-11-13T14:38:30Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-13T14:38:30Z | 2023-11-13T14:38:30.000Z | 2023-11-13T14:38:30 | ---
license: mit
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_L-R__LLmRa-2.7B_public | open-llm-leaderboard | 2023-11-13T14:55:35Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T14:55:35Z | 2023-11-13T14:54:50.000Z | 2023-11-13T14:54:50 | ---
pretty_name: Evaluation run of L-R/LLmRa-2.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [L-R/LLmRa-2.7B](https://huggingface.co/L-R/LLmRa-2.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_L-R__LLmRa-2.7B_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T14:52:35.782186](https://huggingface.co/datasets/open-llm-leaderboard/details_L-R__LLmRa-2.7B_public/blob/main/results_2023-11-13T14-52-35.782186.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2619182180653927,\n\
\ \"acc_stderr\": 0.031054877346083407,\n \"acc_norm\": 0.2636967484818349,\n\
\ \"acc_norm_stderr\": 0.031856551298856575,\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602581,\n \"mc2\": 0.3522535522108365,\n\
\ \"mc2_stderr\": 0.01379814047299605,\n \"em\": 0.0009437919463087249,\n\
\ \"em_stderr\": 0.0003144653119413285,\n \"f1\": 0.04760067114093977,\n\
\ \"f1_stderr\": 0.0011764663842453984\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.32081911262798635,\n \"acc_stderr\": 0.013640943091946526,\n\
\ \"acc_norm\": 0.3703071672354949,\n \"acc_norm_stderr\": 0.01411129875167495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4561840270862378,\n\
\ \"acc_stderr\": 0.004970585328297622,\n \"acc_norm\": 0.6064528978291177,\n\
\ \"acc_norm_stderr\": 0.0048753793520798245\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.033176727875331574,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.033176727875331574\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.0256042334708991,\n\
\ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.0256042334708991\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n\
\ \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23015873015873015,\n \"acc_stderr\": 0.021679219663693135,\n \"\
acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.021679219663693135\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287394,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.19032258064516128,\n \"acc_stderr\": 0.02233170761182307,\n \"\
acc_norm\": 0.19032258064516128,\n \"acc_norm_stderr\": 0.02233170761182307\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n \"\
acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586804,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586804\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752943,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752943\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2358974358974359,\n \"acc_stderr\": 0.021525965407408726,\n\
\ \"acc_norm\": 0.2358974358974359,\n \"acc_norm_stderr\": 0.021525965407408726\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868956,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23302752293577983,\n \"acc_stderr\": 0.018125669180861507,\n \"\
acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.018125669180861507\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2175925925925926,\n \"acc_stderr\": 0.028139689444859693,\n \"\
acc_norm\": 0.2175925925925926,\n \"acc_norm_stderr\": 0.028139689444859693\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.29535864978902954,\n \"acc_stderr\": 0.029696338713422893,\n \
\ \"acc_norm\": 0.29535864978902954,\n \"acc_norm_stderr\": 0.029696338713422893\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21076233183856502,\n\
\ \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.21076233183856502,\n\
\ \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.33980582524271846,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.33980582524271846,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23076923076923078,\n\
\ \"acc_stderr\": 0.027601921381417614,\n \"acc_norm\": 0.23076923076923078,\n\
\ \"acc_norm_stderr\": 0.027601921381417614\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26309067688378035,\n\
\ \"acc_stderr\": 0.015745497169049046,\n \"acc_norm\": 0.26309067688378035,\n\
\ \"acc_norm_stderr\": 0.015745497169049046\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.023176298203992002,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.023176298203992002\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.0248480182638752,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.0248480182638752\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3440514469453376,\n\
\ \"acc_stderr\": 0.026981478043648022,\n \"acc_norm\": 0.3440514469453376,\n\
\ \"acc_norm_stderr\": 0.026981478043648022\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.20987654320987653,\n \"acc_stderr\": 0.02265834408598136,\n\
\ \"acc_norm\": 0.20987654320987653,\n \"acc_norm_stderr\": 0.02265834408598136\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590634,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590634\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24445893089960888,\n\
\ \"acc_stderr\": 0.010976425013113899,\n \"acc_norm\": 0.24445893089960888,\n\
\ \"acc_norm_stderr\": 0.010976425013113899\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19117647058823528,\n \"acc_stderr\": 0.023886881922440345,\n\
\ \"acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.023886881922440345\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528044,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528044\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.36363636363636365,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.36363636363636365,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.29850746268656714,\n\
\ \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.29850746268656714,\n\
\ \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\
\ \"acc_stderr\": 0.033844291552331346,\n \"acc_norm\": 0.25301204819277107,\n\
\ \"acc_norm_stderr\": 0.033844291552331346\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602581,\n \"mc2\": 0.3522535522108365,\n\
\ \"mc2_stderr\": 0.01379814047299605\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6156274664561957,\n \"acc_stderr\": 0.01367156760083619\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0009437919463087249,\n \
\ \"em_stderr\": 0.0003144653119413285,\n \"f1\": 0.04760067114093977,\n\
\ \"f1_stderr\": 0.0011764663842453984\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.003032600454890068,\n \"acc_stderr\": 0.0015145735612245427\n\
\ }\n}\n```"
repo_url: https://huggingface.co/L-R/LLmRa-2.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|arc:challenge|25_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|drop|3_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|gsm8k|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hellaswag|10_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|winogrande|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T14-52-35.782186.parquet'
- config_name: results
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- results_2023-11-13T14-52-35.782186.parquet
- split: latest
path:
- results_2023-11-13T14-52-35.782186.parquet
---
# Dataset Card for Evaluation run of L-R/LLmRa-2.7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/L-R/LLmRa-2.7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [L-R/LLmRa-2.7B](https://huggingface.co/L-R/LLmRa-2.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_L-R__LLmRa-2.7B_public",
"harness_winogrande_5",
split="train")
```
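The aggregated metrics live in the `results` configuration declared in this card's YAML, and every configuration also exposes a `latest` split that resolves to the most recent run. A minimal sketch of loading them (assuming the same `datasets` API as the snippet above):

```python
from datasets import load_dataset

# Load the aggregated metrics from the most recent run; the "latest" split
# always resolves to the newest results parquet declared in the YAML above.
results = load_dataset(
    "open-llm-leaderboard/details_L-R__LLmRa-2.7B_public",
    "results",
    split="latest",
)
```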
## Latest results
These are the [latest results from run 2023-11-13T14:52:35.782186](https://huggingface.co/datasets/open-llm-leaderboard/details_L-R__LLmRa-2.7B_public/blob/main/results_2023-11-13T14-52-35.782186.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2619182180653927,
"acc_stderr": 0.031054877346083407,
"acc_norm": 0.2636967484818349,
"acc_norm_stderr": 0.031856551298856575,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602581,
"mc2": 0.3522535522108365,
"mc2_stderr": 0.01379814047299605,
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413285,
"f1": 0.04760067114093977,
"f1_stderr": 0.0011764663842453984
},
"harness|arc:challenge|25": {
"acc": 0.32081911262798635,
"acc_stderr": 0.013640943091946526,
"acc_norm": 0.3703071672354949,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.4561840270862378,
"acc_stderr": 0.004970585328297622,
"acc_norm": 0.6064528978291177,
"acc_norm_stderr": 0.0048753793520798245
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.033176727875331574,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.033176727875331574
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.0256042334708991,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.0256042334708991
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.021679219663693135,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.021679219663693135
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287394,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.19032258064516128,
"acc_stderr": 0.02233170761182307,
"acc_norm": 0.19032258064516128,
"acc_norm_stderr": 0.02233170761182307
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586804,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586804
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752943,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752943
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2358974358974359,
"acc_stderr": 0.021525965407408726,
"acc_norm": 0.2358974358974359,
"acc_norm_stderr": 0.021525965407408726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868956,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.018125669180861507,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.018125669180861507
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2175925925925926,
"acc_stderr": 0.028139689444859693,
"acc_norm": 0.2175925925925926,
"acc_norm_stderr": 0.028139689444859693
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29535864978902954,
"acc_stderr": 0.029696338713422893,
"acc_norm": 0.29535864978902954,
"acc_norm_stderr": 0.029696338713422893
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21076233183856502,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.21076233183856502,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.33980582524271846,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.33980582524271846,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.027601921381417614,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.027601921381417614
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26309067688378035,
"acc_stderr": 0.015745497169049046,
"acc_norm": 0.26309067688378035,
"acc_norm_stderr": 0.015745497169049046
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.023176298203992002,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.023176298203992002
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3440514469453376,
"acc_stderr": 0.026981478043648022,
"acc_norm": 0.3440514469453376,
"acc_norm_stderr": 0.026981478043648022
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.20987654320987653,
"acc_stderr": 0.02265834408598136,
"acc_norm": 0.20987654320987653,
"acc_norm_stderr": 0.02265834408598136
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590634,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590634
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24445893089960888,
"acc_stderr": 0.010976425013113899,
"acc_norm": 0.24445893089960888,
"acc_norm_stderr": 0.010976425013113899
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.023886881922440345,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.023886881922440345
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528044,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528044
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.29850746268656714,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.29850746268656714,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.033844291552331346,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.033844291552331346
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602581,
"mc2": 0.3522535522108365,
"mc2_stderr": 0.01379814047299605
},
"harness|winogrande|5": {
"acc": 0.6156274664561957,
"acc_stderr": 0.01367156760083619
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413285,
"f1": 0.04760067114093977,
"f1_stderr": 0.0011764663842453984
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245427
}
}
```
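The same numbers can also be read straight from the raw JSON file linked above. A small sketch, assuming `huggingface_hub` is installed (note the full file on the Hub may contain more top-level keys than the snippet shown here, so inspect the structure before indexing into it):

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_L-R__LLmRa-2.7B_public",
    filename="results_2023-11-13T14-52-35.782186.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# List the top-level keys before drilling into per-task metrics.
print(list(results.keys()))
```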
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7211433053016663,
-0.8571552634239197,
0.288064181804657,
0.2000463306903839,
-0.17322292923927307,
-0.018177403137087822,
0.028445525094866753,
-0.2467227280139923,
0.6080135703086853,
-0.06478695571422577,
-0.49208685755729675,
-0.6722737550735474,
-0.44652947783470154,
0.231171175837... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
mesolitica/translated-glaive-code-assistant-v2 | mesolitica | 2023-11-13T15:05:07Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T15:05:07Z | 2023-11-13T15:03:41.000Z | 2023-11-13T15:03:41 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
sreejith8100/death_marriage_data | sreejith8100 | 2023-11-13T15:17:23Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T15:17:23Z | 2023-11-13T15:14:28.000Z | 2023-11-13T15:14:28 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': death
'1': marriage
splits:
- name: train
num_bytes: 579589900.0
num_examples: 448
- name: test
num_bytes: 13589304.0
num_examples: 20
download_size: 593212683
dataset_size: 593179204.0
---
# Dataset Card for "death_marriage_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.5123040676116943,
-0.2512423098087311,
0.3580423891544342,
0.3292447328567505,
-0.46615755558013916,
-0.16454675793647766,
0.34878677129745483,
0.1574164628982544,
0.7428018450737,
0.48971909284591675,
-0.7370783090591431,
-0.8848146200180054,
-0.5360028147697449,
-0.45234760642051697,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Trino123/lex-friedman-chunked | Trino123 | 2023-11-14T09:01:59Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-14T09:01:59Z | 2023-11-13T15:24:41.000Z | 2023-11-13T15:24:41 | ---
license: mit
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_postbot__emailgen-pythia-410m-deduped_public | open-llm-leaderboard | 2023-11-13T15:27:36Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T15:27:36Z | 2023-11-13T15:26:49.000Z | 2023-11-13T15:26:49 | ---
pretty_name: Evaluation run of postbot/emailgen-pythia-410m-deduped
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [postbot/emailgen-pythia-410m-deduped](https://huggingface.co/postbot/emailgen-pythia-410m-deduped)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_postbot__emailgen-pythia-410m-deduped_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T15:24:35.622872](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__emailgen-pythia-410m-deduped_public/blob/main/results_2023-11-13T15-24-35.622872.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2739821268942055,\n\
\ \"acc_stderr\": 0.031358822799769724,\n \"acc_norm\": 0.2757926465489037,\n\
\ \"acc_norm_stderr\": 0.03219166127988676,\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.01456650696139673,\n \"mc2\": 0.3819742528315203,\n\
\ \"mc2_stderr\": 0.015246089965112817,\n \"em\": 0.00020973154362416107,\n\
\ \"em_stderr\": 0.00014829481977280738,\n \"f1\": 0.009905620805369138,\n\
\ \"f1_stderr\": 0.0005041998138971091\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2593856655290102,\n \"acc_stderr\": 0.012808273573927102,\n\
\ \"acc_norm\": 0.2790102389078498,\n \"acc_norm_stderr\": 0.013106784883601333\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.34027086237801235,\n\
\ \"acc_stderr\": 0.004728318577835236,\n \"acc_norm\": 0.4004182433778132,\n\
\ \"acc_norm_stderr\": 0.00488981748973969\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740234,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740234\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.33584905660377357,\n \"acc_stderr\": 0.029067220146644826,\n\
\ \"acc_norm\": 0.33584905660377357,\n \"acc_norm_stderr\": 0.029067220146644826\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.036539469694421,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.036539469694421\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001976,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001976\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.033687629322594316,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.033687629322594316\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.029896145682095455,\n\
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.029896145682095455\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.034559302019248096,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.034559302019248096\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557836,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557836\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22903225806451613,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.22903225806451613,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.031447125816782426,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.031447125816782426\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3181818181818182,\n \"acc_stderr\": 0.03318477333845331,\n \"\
acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.03318477333845331\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35751295336787564,\n \"acc_stderr\": 0.034588160421810045,\n\
\ \"acc_norm\": 0.35751295336787564,\n \"acc_norm_stderr\": 0.034588160421810045\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.024359581465396987,\n\
\ \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396987\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.03086868260412163,\n \
\ \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.03086868260412163\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.344954128440367,\n \"acc_stderr\": 0.02038060540506697,\n \"acc_norm\"\
: 0.344954128440367,\n \"acc_norm_stderr\": 0.02038060540506697\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.033622774366080424,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.033622774366080424\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.03058759135160425,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.03058759135160425\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2109704641350211,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.2109704641350211,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.12556053811659193,\n\
\ \"acc_stderr\": 0.022238985469323774,\n \"acc_norm\": 0.12556053811659193,\n\
\ \"acc_norm_stderr\": 0.022238985469323774\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.366412213740458,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.366412213740458,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.0384985609879409,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.0384985609879409\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252628,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.15178571428571427,\n\
\ \"acc_stderr\": 0.034057028381856945,\n \"acc_norm\": 0.15178571428571427,\n\
\ \"acc_norm_stderr\": 0.034057028381856945\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.36893203883495146,\n \"acc_stderr\": 0.047776151811567386,\n\
\ \"acc_norm\": 0.36893203883495146,\n \"acc_norm_stderr\": 0.047776151811567386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21367521367521367,\n\
\ \"acc_stderr\": 0.026853450377009154,\n \"acc_norm\": 0.21367521367521367,\n\
\ \"acc_norm_stderr\": 0.026853450377009154\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.22988505747126436,\n\
\ \"acc_stderr\": 0.015046301846691807,\n \"acc_norm\": 0.22988505747126436,\n\
\ \"acc_norm_stderr\": 0.015046301846691807\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21098265895953758,\n \"acc_stderr\": 0.021966309947043117,\n\
\ \"acc_norm\": 0.21098265895953758,\n \"acc_norm_stderr\": 0.021966309947043117\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095273,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095273\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.025261691219729498,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.025261691219729498\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2347266881028939,\n\
\ \"acc_stderr\": 0.024071805887677045,\n \"acc_norm\": 0.2347266881028939,\n\
\ \"acc_norm_stderr\": 0.024071805887677045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.023576881744005705,\n\
\ \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.023576881744005705\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.02551873104953776,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.02551873104953776\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25097783572359844,\n\
\ \"acc_stderr\": 0.01107373029918723,\n \"acc_norm\": 0.25097783572359844,\n\
\ \"acc_norm_stderr\": 0.01107373029918723\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.030008562845003476,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.030008562845003476\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24183006535947713,\n \"acc_stderr\": 0.017322789207784326,\n \
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.017322789207784326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n\
\ \"acc_stderr\": 0.03764425585984926,\n \"acc_norm\": 0.19090909090909092,\n\
\ \"acc_norm_stderr\": 0.03764425585984926\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.030769444967296028,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.030769444967296028\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\
\ \"acc_stderr\": 0.033844291552331346,\n \"acc_norm\": 0.25301204819277107,\n\
\ \"acc_norm_stderr\": 0.033844291552331346\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.01456650696139673,\n \"mc2\": 0.3819742528315203,\n\
\ \"mc2_stderr\": 0.015246089965112817\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5209155485398579,\n \"acc_stderr\": 0.014040185494212947\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.00020973154362416107,\n \
\ \"em_stderr\": 0.00014829481977280738,\n \"f1\": 0.009905620805369138,\n\
\ \"f1_stderr\": 0.0005041998138971091\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/postbot/emailgen-pythia-410m-deduped
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|drop|3_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|winogrande|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T15-24-35.622872.parquet'
- config_name: results
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- results_2023-11-13T15-24-35.622872.parquet
- split: latest
path:
- results_2023-11-13T15-24-35.622872.parquet
---
# Dataset Card for Evaluation run of postbot/emailgen-pythia-410m-deduped
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/postbot/emailgen-pythia-410m-deduped
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [postbot/emailgen-pythia-410m-deduped](https://huggingface.co/postbot/emailgen-pythia-410m-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_postbot__emailgen-pythia-410m-deduped_public",
"harness_winogrande_5",
split="train")
```
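The aggregated scores can be loaded the same way. Below is a minimal sketch, assuming the `results` configuration declared in the header above behaves like the per-task configurations and exposes the same `latest` split:
```python
from datasets import load_dataset

# Load the aggregated scores of the most recent run; the "results"
# configuration and its "latest" split are declared in the YAML header above.
results = load_dataset(
    "open-llm-leaderboard/details_postbot__emailgen-pythia-410m-deduped_public",
    "results",
    split="latest",
)
```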
## Latest results
These are the [latest results from run 2023-11-13T15:24:35.622872](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__emailgen-pythia-410m-deduped_public/blob/main/results_2023-11-13T15-24-35.622872.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the "results" configuration and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2739821268942055,
"acc_stderr": 0.031358822799769724,
"acc_norm": 0.2757926465489037,
"acc_norm_stderr": 0.03219166127988676,
"mc1": 0.22276621787025705,
"mc1_stderr": 0.01456650696139673,
"mc2": 0.3819742528315203,
"mc2_stderr": 0.015246089965112817,
"em": 0.00020973154362416107,
"em_stderr": 0.00014829481977280738,
"f1": 0.009905620805369138,
"f1_stderr": 0.0005041998138971091
},
"harness|arc:challenge|25": {
"acc": 0.2593856655290102,
"acc_stderr": 0.012808273573927102,
"acc_norm": 0.2790102389078498,
"acc_norm_stderr": 0.013106784883601333
},
"harness|hellaswag|10": {
"acc": 0.34027086237801235,
"acc_stderr": 0.004728318577835236,
"acc_norm": 0.4004182433778132,
"acc_norm_stderr": 0.00488981748973969
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740234,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740234
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33584905660377357,
"acc_stderr": 0.029067220146644826,
"acc_norm": 0.33584905660377357,
"acc_norm_stderr": 0.029067220146644826
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.036539469694421,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.036539469694421
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001976,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001976
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.033687629322594316,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.033687629322594316
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.029896145682095455,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.029896145682095455
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.034559302019248096,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.034559302019248096
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643895,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643895
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557836,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557836
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22903225806451613,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.22903225806451613,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.031447125816782426,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.031447125816782426
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.03318477333845331,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.03318477333845331
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35751295336787564,
"acc_stderr": 0.034588160421810045,
"acc_norm": 0.35751295336787564,
"acc_norm_stderr": 0.034588160421810045
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396987,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396987
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.03086868260412163,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.03086868260412163
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.344954128440367,
"acc_stderr": 0.02038060540506697,
"acc_norm": 0.344954128440367,
"acc_norm_stderr": 0.02038060540506697
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.033622774366080424,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.033622774366080424
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2109704641350211,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.2109704641350211,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.12556053811659193,
"acc_stderr": 0.022238985469323774,
"acc_norm": 0.12556053811659193,
"acc_norm_stderr": 0.022238985469323774
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.366412213740458,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.366412213740458,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.0384985609879409,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.0384985609879409
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252628,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.15178571428571427,
"acc_stderr": 0.034057028381856945,
"acc_norm": 0.15178571428571427,
"acc_norm_stderr": 0.034057028381856945
},
"harness|hendrycksTest-management|5": {
"acc": 0.36893203883495146,
"acc_stderr": 0.047776151811567386,
"acc_norm": 0.36893203883495146,
"acc_norm_stderr": 0.047776151811567386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.21367521367521367,
"acc_stderr": 0.026853450377009154,
"acc_norm": 0.21367521367521367,
"acc_norm_stderr": 0.026853450377009154
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.22988505747126436,
"acc_stderr": 0.015046301846691807,
"acc_norm": 0.22988505747126436,
"acc_norm_stderr": 0.015046301846691807
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21098265895953758,
"acc_stderr": 0.021966309947043117,
"acc_norm": 0.21098265895953758,
"acc_norm_stderr": 0.021966309947043117
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095273,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095273
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.025261691219729498,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.025261691219729498
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2347266881028939,
"acc_stderr": 0.024071805887677045,
"acc_norm": 0.2347266881028939,
"acc_norm_stderr": 0.024071805887677045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2345679012345679,
"acc_stderr": 0.023576881744005705,
"acc_norm": 0.2345679012345679,
"acc_norm_stderr": 0.023576881744005705
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.02551873104953776,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.02551873104953776
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25097783572359844,
"acc_stderr": 0.01107373029918723,
"acc_norm": 0.25097783572359844,
"acc_norm_stderr": 0.01107373029918723
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.030008562845003476,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.030008562845003476
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984926,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984926
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.030769444967296028,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.030769444967296028
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.033844291552331346,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.033844291552331346
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22276621787025705,
"mc1_stderr": 0.01456650696139673,
"mc2": 0.3819742528315203,
"mc2_stderr": 0.015246089965112817
},
"harness|winogrande|5": {
"acc": 0.5209155485398579,
"acc_stderr": 0.014040185494212947
},
"harness|drop|3": {
"em": 0.00020973154362416107,
"em_stderr": 0.00014829481977280738,
"f1": 0.009905620805369138,
"f1_stderr": 0.0005041998138971091
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
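For illustration, an MMLU-style score can be derived from this dictionary by macro-averaging the `acc` values of the `harness|hendrycksTest-*` entries. A minimal sketch, assuming `results` holds the dictionary printed above (for example, parsed from the linked `results_*.json` file):
```python
def mmlu_macro_average(results: dict) -> float:
    """Macro-average `acc` over the harness|hendrycksTest-* subtasks of a
    results dictionary shaped like the one printed above."""
    scores = [
        task["acc"]
        for name, task in results.items()
        if name.startswith("harness|hendrycksTest-")
    ]
    return sum(scores) / len(scores)
```
Note that this average need not equal the overall `all.acc` value above, since that figure also averages over non-MMLU tasks such as ARC, HellaSwag and Winogrande.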
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7194835543632507,
-0.8458877801895142,
0.28125882148742676,
0.17195342481136322,
-0.17307355999946594,
-0.04080367833375931,
0.011982060968875885,
-0.20326177775859833,
0.5764116644859314,
-0.052699021995067596,
-0.468481183052063,
-0.6945263147354126,
-0.48001471161842346,
0.2528888583... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_bofenghuang__vigostral-7b-chat_public | open-llm-leaderboard | 2023-11-13T15:33:15Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T15:33:15Z | 2023-11-13T15:32:26.000Z | 2023-11-13T15:32:26 | ---
pretty_name: Evaluation run of bofenghuang/vigostral-7b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bofenghuang/vigostral-7b-chat](https://huggingface.co/bofenghuang/vigostral-7b-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bofenghuang__vigostral-7b-chat_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T15:29:27.357304](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigostral-7b-chat_public/blob/main/results_2023-11-13T15-29-27.357304.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the \"results\" configuration and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6295084211098857,\n\
\ \"acc_stderr\": 0.03240910499451327,\n \"acc_norm\": 0.6386954674519838,\n\
\ \"acc_norm_stderr\": 0.03311457250909517,\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.49239586566553695,\n\
\ \"mc2_stderr\": 0.014798235305508963,\n \"em\": 0.06596057046979865,\n\
\ \"em_stderr\": 0.0025419350983795505,\n \"f1\": 0.13260171979865745,\n\
\ \"f1_stderr\": 0.0027787818602447705\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.0143610972884497,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759086\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6408086038637721,\n\
\ \"acc_stderr\": 0.004787829168255654,\n \"acc_norm\": 0.8433578968333001,\n\
\ \"acc_norm_stderr\": 0.0036272018740533918\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246487,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246487\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\"\
: 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3851851851851852,\n \"acc_stderr\": 0.029670906124630886,\n \
\ \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.029670906124630886\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612927,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612927\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.014143970276657569,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.014143970276657569\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.02475241196091721,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.02475241196091721\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n\
\ \"acc_stderr\": 0.015382845587584518,\n \"acc_norm\": 0.3039106145251397,\n\
\ \"acc_norm_stderr\": 0.015382845587584518\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824775,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824775\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n\
\ \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n\
\ \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533207,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533207\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.49239586566553695,\n\
\ \"mc2_stderr\": 0.014798235305508963\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.01152446695409025\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.06596057046979865,\n \
\ \"em_stderr\": 0.0025419350983795505,\n \"f1\": 0.13260171979865745,\n\
\ \"f1_stderr\": 0.0027787818602447705\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.16755117513267628,\n \"acc_stderr\": 0.01028714369371122\n\
\ }\n}\n```"
repo_url: https://huggingface.co/bofenghuang/vigostral-7b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|drop|3_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-29-27.357304.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-29-27.357304.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- '**/details_harness|winogrande|5_2023-11-13T15-29-27.357304.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T15-29-27.357304.parquet'
- config_name: results
data_files:
- split: 2023_11_13T15_29_27.357304
path:
- results_2023-11-13T15-29-27.357304.parquet
- split: latest
path:
- results_2023-11-13T15-29-27.357304.parquet
---
# Dataset Card for Evaluation run of bofenghuang/vigostral-7b-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bofenghuang/vigostral-7b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bofenghuang/vigostral-7b-chat](https://huggingface.co/bofenghuang/vigostral-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bofenghuang__vigostral-7b-chat_public",
"harness_winogrande_5",
split="train")
```
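The aggregated metrics mentioned above can be loaded the same way; a minimal sketch using the "results" configuration and the "latest" split (both defined in this repository's configs):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_bofenghuang__vigostral-7b-chat_public",
    "results",
    split="latest",
)
```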
## Latest results
These are the [latest results from run 2023-11-13T15:29:27.357304](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigostral-7b-chat_public/blob/main/results_2023-11-13T15-29-27.357304.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6295084211098857,
"acc_stderr": 0.03240910499451327,
"acc_norm": 0.6386954674519838,
"acc_norm_stderr": 0.03311457250909517,
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.49239586566553695,
"mc2_stderr": 0.014798235305508963,
"em": 0.06596057046979865,
"em_stderr": 0.0025419350983795505,
"f1": 0.13260171979865745,
"f1_stderr": 0.0027787818602447705
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.0143610972884497,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759086
},
"harness|hellaswag|10": {
"acc": 0.6408086038637721,
"acc_stderr": 0.004787829168255654,
"acc_norm": 0.8433578968333001,
"acc_norm_stderr": 0.0036272018740533918
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246487,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246487
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.029670906124630886,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.029670906124630886
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612927,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612927
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657569,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657569
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.02475241196091721,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.02475241196091721
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3039106145251397,
"acc_stderr": 0.015382845587584518,
"acc_norm": 0.3039106145251397,
"acc_norm_stderr": 0.015382845587584518
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533207,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533207
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.49239586566553695,
"mc2_stderr": 0.014798235305508963
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.01152446695409025
},
"harness|drop|3": {
"em": 0.06596057046979865,
"em_stderr": 0.0025419350983795505,
"f1": 0.13260171979865745,
"f1_stderr": 0.0027787818602447705
},
"harness|gsm8k|5": {
"acc": 0.16755117513267628,
"acc_stderr": 0.01028714369371122
}
}
```
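As a quick sanity check on the numbers above, the per-subtask MMLU (hendrycksTest) accuracies can be averaged directly. A minimal sketch, assuming the JSON above has been saved to a local file named `results.json` (a hypothetical filename):
```python
import json

# Load the "Latest results" JSON shown above (filename is illustrative).
with open("results.json") as f:
    results = json.load(f)

# Average the accuracy over all MMLU (hendrycksTest) subtasks.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mmlu_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"MMLU mean accuracy over {len(mmlu)} subtasks: {mmlu_acc:.4f}")
```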
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7122208476066589,
-0.8903648257255554,
0.2513861060142517,
0.24327056109905243,
-0.20227780938148499,
-0.03731989860534668,
-0.015264502726495266,
-0.22924663126468658,
0.5810652375221252,
-0.025919035077095032,
-0.4822715222835541,
-0.7139671444892883,
-0.44450291991233826,
0.235895127... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_Norquinal__Mistral-7B-claude-instruct_public | open-llm-leaderboard | 2023-11-13T15:38:24Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T15:38:24Z | 2023-11-13T15:37:37.000Z | 2023-11-13T15:37:37 | ---
pretty_name: Evaluation run of Norquinal/Mistral-7B-claude-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Norquinal/Mistral-7B-claude-instruct](https://huggingface.co/Norquinal/Mistral-7B-claude-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Norquinal__Mistral-7B-claude-instruct_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T15:34:36.635642](https://huggingface.co/datasets/open-llm-leaderboard/details_Norquinal__Mistral-7B-claude-instruct_public/blob/main/results_2023-11-13T15-34-36.635642.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6328012974181245,\n\
\ \"acc_stderr\": 0.032347704149397305,\n \"acc_norm\": 0.6418533753559277,\n\
\ \"acc_norm_stderr\": 0.03304428598840875,\n \"mc1\": 0.32313341493268055,\n\
\ \"mc1_stderr\": 0.0163718362864546,\n \"mc2\": 0.4747061071538381,\n\
\ \"mc2_stderr\": 0.014816247527686706,\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.0003921042190298484,\n \"f1\": 0.06348154362416109,\n\
\ \"f1_stderr\": 0.0013886897198441997\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.01430175222327954,\n\
\ \"acc_norm\": 0.6322525597269625,\n \"acc_norm_stderr\": 0.014090995618168484\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6502688707428799,\n\
\ \"acc_stderr\": 0.00475910343238076,\n \"acc_norm\": 0.8499302927703645,\n\
\ \"acc_norm_stderr\": 0.003564098420387769\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \
\ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465718,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465718\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203627,\n \"\
acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203627\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092382,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092382\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899133,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899133\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247337,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247337\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3463687150837989,\n\
\ \"acc_stderr\": 0.015913546784020117,\n \"acc_norm\": 0.3463687150837989,\n\
\ \"acc_norm_stderr\": 0.015913546784020117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829714,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829714\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n\
\ \"acc_stderr\": 0.012685906538206242,\n \"acc_norm\": 0.4426336375488918,\n\
\ \"acc_norm_stderr\": 0.012685906538206242\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406752,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406752\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360375,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360375\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291286,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291286\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.036845294917747115,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.036845294917747115\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32313341493268055,\n\
\ \"mc1_stderr\": 0.0163718362864546,\n \"mc2\": 0.4747061071538381,\n\
\ \"mc2_stderr\": 0.014816247527686706\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773239\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0014681208053691276,\n \
\ \"em_stderr\": 0.0003921042190298484,\n \"f1\": 0.06348154362416109,\n\
\ \"f1_stderr\": 0.0013886897198441997\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.17968157695223655,\n \"acc_stderr\": 0.010575119964242251\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Norquinal/Mistral-7B-claude-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|drop|3_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|winogrande|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T15-34-36.635642.parquet'
- config_name: results
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- results_2023-11-13T15-34-36.635642.parquet
- split: latest
path:
- results_2023-11-13T15-34-36.635642.parquet
---
# Dataset Card for Evaluation run of Norquinal/Mistral-7B-claude-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Norquinal/Mistral-7B-claude-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Norquinal/Mistral-7B-claude-instruct](https://huggingface.co/Norquinal/Mistral-7B-claude-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Norquinal__Mistral-7B-claude-instruct_public",
"harness_winogrande_5",
split="train")
```
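The same pattern works for any configuration listed in the YAML above; for example, the aggregated scores can be loaded from the `results` configuration, whose `latest` split always points at the most recent run:
```python
from datasets import load_dataset

# The "results" config and its "latest" split are both declared in the
# YAML configs of this card.
results = load_dataset(
    "open-llm-leaderboard/details_Norquinal__Mistral-7B-claude-instruct_public",
    "results",
    split="latest",
)
```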
## Latest results
These are the [latest results from run 2023-11-13T15:34:36.635642](https://huggingface.co/datasets/open-llm-leaderboard/details_Norquinal__Mistral-7B-claude-instruct_public/blob/main/results_2023-11-13T15-34-36.635642.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6328012974181245,
"acc_stderr": 0.032347704149397305,
"acc_norm": 0.6418533753559277,
"acc_norm_stderr": 0.03304428598840875,
"mc1": 0.32313341493268055,
"mc1_stderr": 0.0163718362864546,
"mc2": 0.4747061071538381,
"mc2_stderr": 0.014816247527686706,
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298484,
"f1": 0.06348154362416109,
"f1_stderr": 0.0013886897198441997
},
"harness|arc:challenge|25": {
"acc": 0.6023890784982935,
"acc_stderr": 0.01430175222327954,
"acc_norm": 0.6322525597269625,
"acc_norm_stderr": 0.014090995618168484
},
"harness|hellaswag|10": {
"acc": 0.6502688707428799,
"acc_stderr": 0.00475910343238076,
"acc_norm": 0.8499302927703645,
"acc_norm_stderr": 0.003564098420387769
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465718,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465718
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.016785481159203627,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.016785481159203627
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.033922384053216174,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.033922384053216174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092382,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899133,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899133
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247337,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3463687150837989,
"acc_stderr": 0.015913546784020117,
"acc_norm": 0.3463687150837989,
"acc_norm_stderr": 0.015913546784020117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.012685906538206242,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.012685906538206242
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406752,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406752
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360375,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291286,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291286
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.036845294917747115,
"acc_norm": 0.84,
"acc_norm_stderr": 0.036845294917747115
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32313341493268055,
"mc1_stderr": 0.0163718362864546,
"mc2": 0.4747061071538381,
"mc2_stderr": 0.014816247527686706
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.011616198215773239
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298484,
"f1": 0.06348154362416109,
"f1_stderr": 0.0013886897198441997
},
"harness|gsm8k|5": {
"acc": 0.17968157695223655,
"acc_stderr": 0.010575119964242251
}
}
```
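If you prefer the raw JSON linked above to the `datasets` configs, a minimal sketch follows. It assumes the standard `resolve/main` raw-file path on the Hub and hedges on the file layout, since the snippet above shows the scores directly but the on-disk file may nest them under an extra key:
```python
import json
from urllib.request import urlopen

URL = (
    "https://huggingface.co/datasets/open-llm-leaderboard/"
    "details_Norquinal__Mistral-7B-claude-instruct_public/resolve/main/"
    "results_2023-11-13T15-34-36.635642.json"
)

with urlopen(URL) as f:
    payload = json.load(f)

# The snippet above keys task scores at the top level; some file versions
# nest them under a "results" key, so look in both places.
scores = payload.get("results", payload)
print(scores["all"]["acc"])       # aggregate accuracy
print(scores["all"]["acc_norm"])  # aggregate normalized accuracy
```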
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.685055673122406,
-0.8381515145301819,
0.25487861037254333,
0.22414273023605347,
-0.1720876395702362,
-0.08317627012729645,
0.03514867275953293,
-0.20960737764835358,
0.5522747039794922,
-0.0046505131758749485,
-0.4870670437812805,
-0.7324369549751282,
-0.4279361367225647,
0.238773047924... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
brit2738/CORD19 | brit2738 | 2023-11-13T15:44:16Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T15:44:16Z | 2023-11-13T15:44:16.000Z | 2023-11-13T15:44:16 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Convector/dataforge-economics-standard | Convector | 2023-11-13T16:16:17Z | 0 | 0 | null | [
"task_categories:conversational",
"size_categories:n<1K",
"language:en",
"license:mit",
"finance",
"region:us"
] | 2023-11-13T16:16:17Z | 2023-11-13T15:45:04.000Z | 2023-11-13T15:45:04 | ---
license: mit
task_categories:
- conversational
language:
- en
tags:
- finance
size_categories:
- n<1K
---
---
**Dataset Summary:**
- **Name:** [Teknium DataForge Economics](https://huggingface.co/datasets/teknium/dataforge-economics)
- **Creator:** [Teknium](https://huggingface.co/teknium)
- **License:** MIT
- **Task Categories:** Conversational Analysis in Finance
- **Language:** English
- **Tags:** Finance, Economic Conversations
- **Size:** Less than 1,000 records
- **Format:** JSONL with standard Alpaca structure
- **Unique Features:**
- Fields: `instruction`, `input`, `output`, etc.
  - Additional Fields: `origin` (dataset name), `conversation_id` (unique identifier for tracking conversations); see the loading sketch below
- **Additional Information:** For more detailed information about the dataset, visit [Teknium DataForge Economics on Hugging Face](https://huggingface.co/datasets/teknium/dataforge-economics).
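A minimal loading sketch based on the fields listed above (it assumes the repository exposes a default `train` split; only the documented field names are relied on):
```python
from datasets import load_dataset

# Hypothetical usage: inspect the Alpaca-style fields plus the two
# provenance fields documented above.
ds = load_dataset("Convector/dataforge-economics-standard", split="train")
row = ds[0]
print(row["instruction"])  # Alpaca-style instruction
print(row["output"])       # target response
print(row["origin"], row["conversation_id"])  # provenance fields
```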
Transformed by [Convector](https://github.com/teilomillet/convector) with Love.
---
| [
-0.35901474952697754,
-1.0060043334960938,
0.10788783431053162,
0.5410946607589722,
-0.3855708837509155,
0.20152796804904938,
-0.3681223690509796,
-0.39088672399520874,
0.5467996597290039,
0.5896644592285156,
-0.783939778804779,
-0.8079172968864441,
-0.7074875235557556,
-0.1841239631175995... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_42dot__42dot_LLM-PLM-1.3B_public | open-llm-leaderboard | 2023-11-15T08:15:30Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T08:15:30Z | 2023-11-13T15:46:00.000Z | 2023-11-13T15:46:00 | ---
pretty_name: Evaluation run of 42dot/42dot_LLM-PLM-1.3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [42dot/42dot_LLM-PLM-1.3B](https://huggingface.co/42dot/42dot_LLM-PLM-1.3B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_42dot__42dot_LLM-PLM-1.3B_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-15T08:12:34.029868](https://huggingface.co/datasets/open-llm-leaderboard/details_42dot__42dot_LLM-PLM-1.3B_public/blob/main/results_2023-11-15T08-12-34.029868.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2748396034833794,\n\
\ \"acc_stderr\": 0.03133274597965432,\n \"acc_norm\": 0.2767290148254369,\n\
\ \"acc_norm_stderr\": 0.032124763692846635,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931578,\n \"mc2\": 0.38680931810418795,\n\
\ \"mc2_stderr\": 0.013939564847231014,\n \"em\": 0.001153523489932886,\n\
\ \"em_stderr\": 0.0003476179896857114,\n \"f1\": 0.04587562919463095,\n\
\ \"f1_stderr\": 0.0011468980714363175\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.30119453924914674,\n \"acc_stderr\": 0.013406741767847627,\n\
\ \"acc_norm\": 0.3242320819112628,\n \"acc_norm_stderr\": 0.01367881039951882\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4287990440151364,\n\
\ \"acc_stderr\": 0.0049389301432344514,\n \"acc_norm\": 0.563931487751444,\n\
\ \"acc_norm_stderr\": 0.004948824501355477\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322716,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322716\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.03716177437566016,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.03716177437566016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.27167630057803466,\n\
\ \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.27167630057803466,\n\
\ \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.18620689655172415,\n \"acc_stderr\": 0.03243946159004616,\n\
\ \"acc_norm\": 0.18620689655172415,\n \"acc_norm_stderr\": 0.03243946159004616\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.0220190800122179,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.0220190800122179\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22580645161290322,\n\
\ \"acc_stderr\": 0.02378557788418101,\n \"acc_norm\": 0.22580645161290322,\n\
\ \"acc_norm_stderr\": 0.02378557788418101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479047,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479047\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3471502590673575,\n \"acc_stderr\": 0.034356961683613546,\n\
\ \"acc_norm\": 0.3471502590673575,\n \"acc_norm_stderr\": 0.034356961683613546\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34615384615384615,\n \"acc_stderr\": 0.024121125416941183,\n\
\ \"acc_norm\": 0.34615384615384615,\n \"acc_norm_stderr\": 0.024121125416941183\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230175,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230175\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.29831932773109243,\n \"acc_stderr\": 0.02971914287634286,\n\
\ \"acc_norm\": 0.29831932773109243,\n \"acc_norm_stderr\": 0.02971914287634286\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26055045871559634,\n \"acc_stderr\": 0.01881918203485007,\n \"\
acc_norm\": 0.26055045871559634,\n \"acc_norm_stderr\": 0.01881918203485007\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.0309645179269234,\n \"acc_norm\"\
: 0.2647058823529412,\n \"acc_norm_stderr\": 0.0309645179269234\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.23628691983122363,\n \"acc_stderr\": 0.027652153144159263,\n \"\
acc_norm\": 0.23628691983122363,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.25112107623318386,\n\
\ \"acc_stderr\": 0.029105220833224622,\n \"acc_norm\": 0.25112107623318386,\n\
\ \"acc_norm_stderr\": 0.029105220833224622\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3884297520661157,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.3884297520661157,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n\
\ \"acc_stderr\": 0.02948036054954119,\n \"acc_norm\": 0.28205128205128205,\n\
\ \"acc_norm_stderr\": 0.02948036054954119\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23243933588761176,\n\
\ \"acc_stderr\": 0.015104550008905706,\n \"acc_norm\": 0.23243933588761176,\n\
\ \"acc_norm_stderr\": 0.015104550008905706\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757177,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757177\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n\
\ \"acc_stderr\": 0.014816119635317003,\n \"acc_norm\": 0.2681564245810056,\n\
\ \"acc_norm_stderr\": 0.014816119635317003\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.025829163272757485,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.025829163272757485\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3183279742765273,\n\
\ \"acc_stderr\": 0.02645722506781102,\n \"acc_norm\": 0.3183279742765273,\n\
\ \"acc_norm_stderr\": 0.02645722506781102\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2872340425531915,\n \"acc_stderr\": 0.026992199173064356,\n \
\ \"acc_norm\": 0.2872340425531915,\n \"acc_norm_stderr\": 0.026992199173064356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.010896123652676646,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.010896123652676646\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612378984,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612378984\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2693877551020408,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.2693877551020408,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.034605799075530255,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.034605799075530255\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03377310252209194,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03377310252209194\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931578,\n \"mc2\": 0.38680931810418795,\n\
\ \"mc2_stderr\": 0.013939564847231014\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5887924230465666,\n \"acc_stderr\": 0.013829128358676872\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.001153523489932886,\n \
\ \"em_stderr\": 0.0003476179896857114,\n \"f1\": 0.04587562919463095,\n\
\ \"f1_stderr\": 0.0011468980714363175\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0075815011372251705,\n \"acc_stderr\": 0.002389281512077207\n\
\ }\n}\n```"
repo_url: https://huggingface.co/42dot/42dot_LLM-PLM-1.3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|arc:challenge|25_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|drop|3_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|drop|3_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|gsm8k|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hellaswag|10_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|winogrande|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|winogrande|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-15T08-12-34.029868.parquet'
- config_name: results
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- results_2023-11-13T15-43-12.146243.parquet
- split: 2023_11_15T08_12_34.029868
path:
- results_2023-11-15T08-12-34.029868.parquet
- split: latest
path:
- results_2023-11-15T08-12-34.029868.parquet
---
# Dataset Card for Evaluation run of 42dot/42dot_LLM-PLM-1.3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/42dot/42dot_LLM-PLM-1.3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [42dot/42dot_LLM-PLM-1.3B](https://huggingface.co/42dot/42dot_LLM-PLM-1.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_42dot__42dot_LLM-PLM-1.3B_public",
"harness_winogrande_5",
split="train")
```
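The same pattern works for the aggregated "results" configuration and for individual timestamped runs; below is a minimal sketch (the config and split names are taken from the metadata listed above):
```python
from datasets import load_dataset

# Aggregated metrics across all tasks, as displayed on the leaderboard.
results = load_dataset(
    "open-llm-leaderboard/details_42dot__42dot_LLM-PLM-1.3B_public",
    "results",
    split="latest",
)

# A specific run can also be loaded via its timestamped split
# instead of the "latest" alias.
run = load_dataset(
    "open-llm-leaderboard/details_42dot__42dot_LLM-PLM-1.3B_public",
    "harness_winogrande_5",
    split="2023_11_15T08_12_34.029868",
)
```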
## Latest results
These are the [latest results from run 2023-11-15T08:12:34.029868](https://huggingface.co/datasets/open-llm-leaderboard/details_42dot__42dot_LLM-PLM-1.3B_public/blob/main/results_2023-11-15T08-12-34.029868.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split of each eval):
```json
{
"all": {
"acc": 0.2748396034833794,
"acc_stderr": 0.03133274597965432,
"acc_norm": 0.2767290148254369,
"acc_norm_stderr": 0.032124763692846635,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931578,
"mc2": 0.38680931810418795,
"mc2_stderr": 0.013939564847231014,
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857114,
"f1": 0.04587562919463095,
"f1_stderr": 0.0011468980714363175
},
"harness|arc:challenge|25": {
"acc": 0.30119453924914674,
"acc_stderr": 0.013406741767847627,
"acc_norm": 0.3242320819112628,
"acc_norm_stderr": 0.01367881039951882
},
"harness|hellaswag|10": {
"acc": 0.4287990440151364,
"acc_stderr": 0.0049389301432344514,
"acc_norm": 0.563931487751444,
"acc_norm_stderr": 0.004948824501355477
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322716,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322716
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566016,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.18620689655172415,
"acc_stderr": 0.03243946159004616,
"acc_norm": 0.18620689655172415,
"acc_norm_stderr": 0.03243946159004616
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.0220190800122179,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.0220190800122179
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243156,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22580645161290322,
"acc_stderr": 0.02378557788418101,
"acc_norm": 0.22580645161290322,
"acc_norm_stderr": 0.02378557788418101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02962022787479047,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02962022787479047
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3471502590673575,
"acc_stderr": 0.034356961683613546,
"acc_norm": 0.3471502590673575,
"acc_norm_stderr": 0.034356961683613546
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.024121125416941183,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.024121125416941183
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230175,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230175
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29831932773109243,
"acc_stderr": 0.02971914287634286,
"acc_norm": 0.29831932773109243,
"acc_norm_stderr": 0.02971914287634286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26055045871559634,
"acc_stderr": 0.01881918203485007,
"acc_norm": 0.26055045871559634,
"acc_norm_stderr": 0.01881918203485007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.0309645179269234,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.0309645179269234
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.23628691983122363,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.23628691983122363,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.25112107623318386,
"acc_stderr": 0.029105220833224622,
"acc_norm": 0.25112107623318386,
"acc_norm_stderr": 0.029105220833224622
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3884297520661157,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.3884297520661157,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.02948036054954119,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.02948036054954119
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23243933588761176,
"acc_stderr": 0.015104550008905706,
"acc_norm": 0.23243933588761176,
"acc_norm_stderr": 0.015104550008905706
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757177,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2681564245810056,
"acc_stderr": 0.014816119635317003,
"acc_norm": 0.2681564245810056,
"acc_norm_stderr": 0.014816119635317003
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.025829163272757485,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.025829163272757485
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3183279742765273,
"acc_stderr": 0.02645722506781102,
"acc_norm": 0.3183279742765273,
"acc_norm_stderr": 0.02645722506781102
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967277,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2872340425531915,
"acc_stderr": 0.026992199173064356,
"acc_norm": 0.2872340425531915,
"acc_norm_stderr": 0.026992199173064356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676646,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676646
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.017667841612378984,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.017667841612378984
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2693877551020408,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.2693877551020408,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.034605799075530255,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.034605799075530255
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03377310252209194,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03377310252209194
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931578,
"mc2": 0.38680931810418795,
"mc2_stderr": 0.013939564847231014
},
"harness|winogrande|5": {
"acc": 0.5887924230465666,
"acc_stderr": 0.013829128358676872
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857114,
"f1": 0.04587562919463095,
"f1_stderr": 0.0011468980714363175
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.002389281512077207
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7482534050941467,
-0.8523837924003601,
0.2853775918483734,
0.21472880244255066,
-0.18166758120059967,
-0.04462113231420517,
0.01186268962919712,
-0.21476152539253235,
0.564769446849823,
-0.03303992375731468,
-0.5099482536315918,
-0.7025724053382874,
-0.4291974902153015,
0.24567183852195... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_uukuguy__Mistral-7B-OpenOrca-lora_public | open-llm-leaderboard | 2023-11-13T15:48:06Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T15:48:06Z | 2023-11-13T15:47:20.000Z | 2023-11-13T15:47:20 | ---
pretty_name: Evaluation run of uukuguy/Mistral-7B-OpenOrca-lora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/Mistral-7B-OpenOrca-lora](https://huggingface.co/uukuguy/Mistral-7B-OpenOrca-lora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__Mistral-7B-OpenOrca-lora_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T15:44:18.785582](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__Mistral-7B-OpenOrca-lora_public/blob/main/results_2023-11-13T15-44-18.785582.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6351832920969729,\n\
\ \"acc_stderr\": 0.03210898212657927,\n \"acc_norm\": 0.6445450507876114,\n\
\ \"acc_norm_stderr\": 0.03280393070910138,\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.015785370858396725,\n \"mc2\": 0.4274271734982197,\n\
\ \"mc2_stderr\": 0.014247308828610854,\n \"em\": 0.0019924496644295304,\n\
\ \"em_stderr\": 0.00045666764626669387,\n \"f1\": 0.06191694630872485,\n\
\ \"f1_stderr\": 0.0013823026381279647\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5742320819112628,\n \"acc_stderr\": 0.014449464278868807,\n\
\ \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.014188277712349814\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6357299342760406,\n\
\ \"acc_stderr\": 0.004802413919932666,\n \"acc_norm\": 0.8361880103565027,\n\
\ \"acc_norm_stderr\": 0.003693484894179416\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n\
\ \"acc_stderr\": 0.024362599693031096,\n \"acc_norm\": 0.7580645161290323,\n\
\ \"acc_norm_stderr\": 0.024362599693031096\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\
: 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612927,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612927\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407003,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407003\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n\
\ \"acc_stderr\": 0.015748421208187306,\n \"acc_norm\": 0.3318435754189944,\n\
\ \"acc_norm_stderr\": 0.015748421208187306\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537375,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537375\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45241199478487615,\n\
\ \"acc_stderr\": 0.012712265105889133,\n \"acc_norm\": 0.45241199478487615,\n\
\ \"acc_norm_stderr\": 0.012712265105889133\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093085,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093085\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.015785370858396725,\n \"mc2\": 0.4274271734982197,\n\
\ \"mc2_stderr\": 0.014247308828610854\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.011430450045881575\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0019924496644295304,\n \
\ \"em_stderr\": 0.00045666764626669387,\n \"f1\": 0.06191694630872485,\n\
\ \"f1_stderr\": 0.0013823026381279647\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.1728582259287339,\n \"acc_stderr\": 0.010415432246200585\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/Mistral-7B-OpenOrca-lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|drop|3_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|winogrande|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T15-44-18.785582.parquet'
- config_name: results
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- results_2023-11-13T15-44-18.785582.parquet
- split: latest
path:
- results_2023-11-13T15-44-18.785582.parquet
---
# Dataset Card for Evaluation run of uukuguy/Mistral-7B-OpenOrca-lora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/Mistral-7B-OpenOrca-lora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/Mistral-7B-OpenOrca-lora](https://huggingface.co/uukuguy/Mistral-7B-OpenOrca-lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__Mistral-7B-OpenOrca-lora_public",
"harness_winogrande_5",
split="train")
```
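As a minimal sketch (assuming a recent version of the `datasets` library, and using only the configuration and split names listed in the YAML metadata above), you can likewise load the aggregated "results" configuration, or pin a task to a specific run via its timestamped split instead of `latest`:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_uukuguy__Mistral-7B-OpenOrca-lora_public"

# Aggregated metrics of the run (the "results" configuration);
# the "latest" split always points to the most recent evaluation.
results = load_dataset(REPO, "results", split="latest")

# Any task can instead be pinned to a specific run through its
# timestamped split name, as declared in the configs above.
winogrande = load_dataset(
    REPO,
    "harness_winogrande_5",
    split="2023_11_13T15_44_18.785582",
)

print(results)
print(winogrande)
```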
## Latest results
These are the [latest results from run 2023-11-13T15:44:18.785582](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__Mistral-7B-OpenOrca-lora_public/blob/main/results_2023-11-13T15-44-18.785582.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6351832920969729,
"acc_stderr": 0.03210898212657927,
"acc_norm": 0.6445450507876114,
"acc_norm_stderr": 0.03280393070910138,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.015785370858396725,
"mc2": 0.4274271734982197,
"mc2_stderr": 0.014247308828610854,
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626669387,
"f1": 0.06191694630872485,
"f1_stderr": 0.0013823026381279647
},
"harness|arc:challenge|25": {
"acc": 0.5742320819112628,
"acc_stderr": 0.014449464278868807,
"acc_norm": 0.6194539249146758,
"acc_norm_stderr": 0.014188277712349814
},
"harness|hellaswag|10": {
"acc": 0.6357299342760406,
"acc_stderr": 0.004802413919932666,
"acc_norm": 0.8361880103565027,
"acc_norm_stderr": 0.003693484894179416
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612927,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612927
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.033922384053216174,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.033922384053216174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911901,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911901
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407003,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407003
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187306,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187306
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537375,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537375
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45241199478487615,
"acc_stderr": 0.012712265105889133,
"acc_norm": 0.45241199478487615,
"acc_norm_stderr": 0.012712265105889133
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093085,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093085
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.015785370858396725,
"mc2": 0.4274271734982197,
"mc2_stderr": 0.014247308828610854
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.011430450045881575
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626669387,
"f1": 0.06191694630872485,
"f1_stderr": 0.0013823026381279647
},
"harness|gsm8k|5": {
"acc": 0.1728582259287339,
"acc_stderr": 0.010415432246200585
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7210410833358765,
-0.8578752875328064,
0.2582551836967468,
0.20452243089675903,
-0.19061145186424255,
-0.10893271863460541,
0.016592709347605705,
-0.23280510306358337,
0.5877587199211121,
0.009741395711898804,
-0.4606779217720032,
-0.7179926037788391,
-0.39756491780281067,
0.22031448781... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Convector/dataforge-economics-CC | Convector | 2023-11-13T16:18:47Z | 0 | 0 | null | [
"task_categories:conversational",
"size_categories:n<1K",
"language:en",
"license:mit",
"finance",
"region:us"
] | 2023-11-13T16:18:47Z | 2023-11-13T15:49:39.000Z | 2023-11-13T15:49:39 | ---
license: mit
task_categories:
- conversational
language:
- en
tags:
- finance
size_categories:
- n<1K
---
---
**Dataset Summary:**
- **Name:** [Teknium DataForge Economics](https://huggingface.co/datasets/teknium/dataforge-economics)
- **Creator:** [Teknium](https://huggingface.co/teknium)
- **License:** MIT
- **Task Categories:** Conversational Analysis in Finance
- **Language:** English
- **Tags:** Finance, Economic Conversations
- **Size:** Less than 1,000 records
- **Format:** JSONL following the OpenAI chat-completion structure
- **Unique Features:** (see the loading sketch below)
  - Fields: `"messages": [{"role": "system", "content": ""}, {"role": "user", "content": ""}, {"role": "assistant", "content": ""}], "": ""`
  - Additional Fields: `origin` (dataset name), `conversation_id` (unique identifier for tracking conversations)
- **Additional Information:** For more detailed information about the dataset, visit [Teknium DataForge Economics on Hugging Face](https://huggingface.co/datasets/teknium/dataforge-economics).
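For illustration, a minimal sketch of reading one record in this JSONL layout (the filename `dataforge-economics.jsonl` is hypothetical):

```python
import json

# Read the first record of the JSONL file (hypothetical filename).
with open("dataforge-economics.jsonl", "r", encoding="utf-8") as f:
    record = json.loads(f.readline())

# Each record carries an OpenAI-style chat transcript plus provenance fields.
for message in record["messages"]:
    print(f'{message["role"]}: {message["content"][:80]}')

print("origin:", record.get("origin"))
print("conversation_id:", record.get("conversation_id"))
```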
Transformed by [Convector](https://github.com/teilomillet/convector) with Love.
---
| [
-0.2648431062698364,
-0.9068666100502014,
0.01876073144376278,
0.4213740825653076,
-0.3291372060775757,
0.2627105116844177,
-0.4764030873775482,
-0.2844245433807373,
0.47374868392944336,
0.6006498336791992,
-0.7350406050682068,
-0.7957708835601807,
-0.6487157344818115,
-0.26535382866859436... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_42dot__42dot_LLM-SFT-1.3B_public | open-llm-leaderboard | 2023-11-13T15:50:51Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T15:50:51Z | 2023-11-13T15:50:04.000Z | 2023-11-13T15:50:04 | ---
pretty_name: Evaluation run of 42dot/42dot_LLM-SFT-1.3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [42dot/42dot_LLM-SFT-1.3B](https://huggingface.co/42dot/42dot_LLM-SFT-1.3B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_42dot__42dot_LLM-SFT-1.3B_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T15:47:16.910477](https://huggingface.co/datasets/open-llm-leaderboard/details_42dot__42dot_LLM-SFT-1.3B_public/blob/main/results_2023-11-13T15-47-16.910477.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26083934068438247,\n\
\ \"acc_stderr\": 0.03100224322901986,\n \"acc_norm\": 0.262585495126005,\n\
\ \"acc_norm_stderr\": 0.031783041105593664,\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.39979776889587376,\n\
\ \"mc2_stderr\": 0.014420445519552157,\n \"em\": 0.01583473154362416,\n\
\ \"em_stderr\": 0.0012784360866061313,\n \"f1\": 0.07108431208053706,\n\
\ \"f1_stderr\": 0.0017891407240589372\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3361774744027304,\n \"acc_stderr\": 0.013804855026205758,\n\
\ \"acc_norm\": 0.3609215017064846,\n \"acc_norm_stderr\": 0.01403476138617546\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44214299940250945,\n\
\ \"acc_stderr\": 0.004956262919324398,\n \"acc_norm\": 0.5896235809599681,\n\
\ \"acc_norm_stderr\": 0.004908967278222482\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653697,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653697\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n\
\ \"acc_stderr\": 0.035025531706783165,\n \"acc_norm\": 0.2074074074074074,\n\
\ \"acc_norm_stderr\": 0.035025531706783165\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.033450369167889904,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.033450369167889904\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2936170212765957,\n \"acc_stderr\": 0.029771642712491223,\n\
\ \"acc_norm\": 0.2936170212765957,\n \"acc_norm_stderr\": 0.029771642712491223\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23015873015873015,\n \"acc_stderr\": 0.02167921966369315,\n \"\
acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.02167921966369315\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.18064516129032257,\n \"acc_stderr\": 0.02188617856717255,\n \"\
acc_norm\": 0.18064516129032257,\n \"acc_norm_stderr\": 0.02188617856717255\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.23645320197044334,\n \"acc_stderr\": 0.02989611429173354,\n \"\
acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.02989611429173354\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.18181818181818182,\n \"acc_stderr\": 0.027479603010538797,\n \"\
acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.027479603010538797\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.030516111371476008,\n\
\ \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.030516111371476008\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24615384615384617,\n \"acc_stderr\": 0.021840866990423084,\n\
\ \"acc_norm\": 0.24615384615384617,\n \"acc_norm_stderr\": 0.021840866990423084\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02755361446786382,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02755361446786382\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.20917431192660552,\n \"acc_stderr\": 0.017437937173343226,\n \"\
acc_norm\": 0.20917431192660552,\n \"acc_norm_stderr\": 0.017437937173343226\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.0316746870682898,\n \"acc_norm\"\
: 0.3148148148148148,\n \"acc_norm_stderr\": 0.0316746870682898\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2107843137254902,\n\
\ \"acc_stderr\": 0.028626547912437388,\n \"acc_norm\": 0.2107843137254902,\n\
\ \"acc_norm_stderr\": 0.028626547912437388\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2911392405063291,\n \"acc_stderr\": 0.02957160106575337,\n\
\ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.28699551569506726,\n\
\ \"acc_stderr\": 0.030360379710291954,\n \"acc_norm\": 0.28699551569506726,\n\
\ \"acc_norm_stderr\": 0.030360379710291954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n\
\ \"acc_stderr\": 0.029872577708891148,\n \"acc_norm\": 0.2948717948717949,\n\
\ \"acc_norm_stderr\": 0.029872577708891148\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23499361430395913,\n\
\ \"acc_stderr\": 0.015162024152278434,\n \"acc_norm\": 0.23499361430395913,\n\
\ \"acc_norm_stderr\": 0.015162024152278434\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.023357365785874037,\n\
\ \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.023357365785874037\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.024288619466046112,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.024288619466046112\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.023576881744005723,\n\
\ \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.023576881744005723\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2198581560283688,\n \"acc_stderr\": 0.02470614107070547,\n \
\ \"acc_norm\": 0.2198581560283688,\n \"acc_norm_stderr\": 0.02470614107070547\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n\
\ \"acc_stderr\": 0.01102549929144374,\n \"acc_norm\": 0.24771838331160365,\n\
\ \"acc_norm_stderr\": 0.01102549929144374\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.29044117647058826,\n \"acc_stderr\": 0.027576468622740512,\n\
\ \"acc_norm\": 0.29044117647058826,\n \"acc_norm_stderr\": 0.027576468622740512\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \
\ \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.20816326530612245,\n \"acc_stderr\": 0.025991117672813292,\n\
\ \"acc_norm\": 0.20816326530612245,\n \"acc_norm_stderr\": 0.025991117672813292\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.034605799075530276,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.034605799075530276\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.03546976959393163,\n\
\ \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.03546976959393163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.39979776889587376,\n\
\ \"mc2_stderr\": 0.014420445519552157\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5840568271507498,\n \"acc_stderr\": 0.013852485356798255\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.01583473154362416,\n \
\ \"em_stderr\": 0.0012784360866061313,\n \"f1\": 0.07108431208053706,\n\
\ \"f1_stderr\": 0.0017891407240589372\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.006823351023502654,\n \"acc_stderr\": 0.0022675371022544996\n\
\ }\n}\n```"
repo_url: https://huggingface.co/42dot/42dot_LLM-SFT-1.3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|drop|3_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-47-16.910477.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-47-16.910477.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- '**/details_harness|winogrande|5_2023-11-13T15-47-16.910477.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T15-47-16.910477.parquet'
- config_name: results
data_files:
- split: 2023_11_13T15_47_16.910477
path:
- results_2023-11-13T15-47-16.910477.parquet
- split: latest
path:
- results_2023-11-13T15-47-16.910477.parquet
---
# Dataset Card for Evaluation run of 42dot/42dot_LLM-SFT-1.3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/42dot/42dot_LLM-SFT-1.3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [42dot/42dot_LLM-SFT-1.3B](https://huggingface.co/42dot/42dot_LLM-SFT-1.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_42dot__42dot_LLM-SFT-1.3B_public",
"harness_winogrande_5",
split="train")
```
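Similarly, you can load the aggregated scores through the "results" configuration; the following is a minimal sketch (not part of the auto-generated card), assuming the `latest` split listed in the configs above:
```python
from datasets import load_dataset

# Sketch: the "results" configuration holds the aggregated metrics of the run,
# and its "latest" split always mirrors the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_42dot__42dot_LLM-SFT-1.3B_public",
                       "results",
                       split="latest")
print(results[0])  # one row with the run's aggregated scores
```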
## Latest results
These are the [latest results from run 2023-11-13T15:47:16.910477](https://huggingface.co/datasets/open-llm-leaderboard/details_42dot__42dot_LLM-SFT-1.3B_public/blob/main/results_2023-11-13T15-47-16.910477.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26083934068438247,
"acc_stderr": 0.03100224322901986,
"acc_norm": 0.262585495126005,
"acc_norm_stderr": 0.031783041105593664,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.39979776889587376,
"mc2_stderr": 0.014420445519552157,
"em": 0.01583473154362416,
"em_stderr": 0.0012784360866061313,
"f1": 0.07108431208053706,
"f1_stderr": 0.0017891407240589372
},
"harness|arc:challenge|25": {
"acc": 0.3361774744027304,
"acc_stderr": 0.013804855026205758,
"acc_norm": 0.3609215017064846,
"acc_norm_stderr": 0.01403476138617546
},
"harness|hellaswag|10": {
"acc": 0.44214299940250945,
"acc_stderr": 0.004956262919324398,
"acc_norm": 0.5896235809599681,
"acc_norm_stderr": 0.004908967278222482
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653697,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653697
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.035025531706783165,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.035025531706783165
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.033450369167889904,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.033450369167889904
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2936170212765957,
"acc_stderr": 0.029771642712491223,
"acc_norm": 0.2936170212765957,
"acc_norm_stderr": 0.029771642712491223
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.02167921966369315,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.02167921966369315
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18064516129032257,
"acc_stderr": 0.02188617856717255,
"acc_norm": 0.18064516129032257,
"acc_norm_stderr": 0.02188617856717255
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.02989611429173354,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.02989611429173354
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.027479603010538797,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.027479603010538797
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.030516111371476008,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.030516111371476008
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24615384615384617,
"acc_stderr": 0.021840866990423084,
"acc_norm": 0.24615384615384617,
"acc_norm_stderr": 0.021840866990423084
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02755361446786382,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02755361446786382
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.20917431192660552,
"acc_stderr": 0.017437937173343226,
"acc_norm": 0.20917431192660552,
"acc_norm_stderr": 0.017437937173343226
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.0316746870682898,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.0316746870682898
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2107843137254902,
"acc_stderr": 0.028626547912437388,
"acc_norm": 0.2107843137254902,
"acc_norm_stderr": 0.028626547912437388
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.28699551569506726,
"acc_stderr": 0.030360379710291954,
"acc_norm": 0.28699551569506726,
"acc_norm_stderr": 0.030360379710291954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891148,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891148
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23499361430395913,
"acc_stderr": 0.015162024152278434,
"acc_norm": 0.23499361430395913,
"acc_norm_stderr": 0.015162024152278434
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.024288619466046112,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.024288619466046112
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2345679012345679,
"acc_stderr": 0.023576881744005723,
"acc_norm": 0.2345679012345679,
"acc_norm_stderr": 0.023576881744005723
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2198581560283688,
"acc_stderr": 0.02470614107070547,
"acc_norm": 0.2198581560283688,
"acc_norm_stderr": 0.02470614107070547
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24771838331160365,
"acc_stderr": 0.01102549929144374,
"acc_norm": 0.24771838331160365,
"acc_norm_stderr": 0.01102549929144374
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.29044117647058826,
"acc_stderr": 0.027576468622740512,
"acc_norm": 0.29044117647058826,
"acc_norm_stderr": 0.027576468622740512
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20816326530612245,
"acc_stderr": 0.025991117672813292,
"acc_norm": 0.20816326530612245,
"acc_norm_stderr": 0.025991117672813292
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.034605799075530276,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.034605799075530276
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.03546976959393163,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.03546976959393163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.39979776889587376,
"mc2_stderr": 0.014420445519552157
},
"harness|winogrande|5": {
"acc": 0.5840568271507498,
"acc_stderr": 0.013852485356798255
},
"harness|drop|3": {
"em": 0.01583473154362416,
"em_stderr": 0.0012784360866061313,
"f1": 0.07108431208053706,
"f1_stderr": 0.0017891407240589372
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.0022675371022544996
}
}
```
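As a rough, hedged sketch (not part of the generated card), the dictionary above can be post-processed directly. The snippet assumes the JSON was saved locally as `results.json` (a hypothetical filename):
```python
import json

# Assumption: the results JSON shown above was saved locally as results.json.
with open("results.json") as f:
    results = json.load(f)

# Aggregated metrics for the whole run live under the "all" key.
overall = results["all"]
print(f"acc = {overall['acc']:.4f} ± {overall['acc_stderr']:.4f}")

# Mean accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(f"MMLU mean acc over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")
```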
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7465877532958984,
-0.8885831832885742,
0.29727503657341003,
0.21670371294021606,
-0.17458829283714294,
-0.03915632888674736,
0.027449633926153183,
-0.20998118817806244,
0.5845320224761963,
-0.028598425909876823,
-0.515114426612854,
-0.6785861253738403,
-0.4275011718273163,
0.25810331106... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-Mk2-7b_public | open-llm-leaderboard | 2023-11-13T15:56:31Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T15:56:31Z | 2023-11-13T15:55:45.000Z | 2023-11-13T15:55:45 | ---
pretty_name: Evaluation run of PocketDoc/Dans-AdventurousWinds-Mk2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PocketDoc/Dans-AdventurousWinds-Mk2-7b](https://huggingface.co/PocketDoc/Dans-AdventurousWinds-Mk2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-Mk2-7b_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T15:52:43.892204](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-Mk2-7b_public/blob/main/results_2023-11-13T15-52-43.892204.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6117457883588181,\n\
\ \"acc_stderr\": 0.03285127869008788,\n \"acc_norm\": 0.621056172344861,\n\
\ \"acc_norm_stderr\": 0.033574977794886766,\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.43563008850906,\n\
\ \"mc2_stderr\": 0.014459760341061523,\n \"em\": 0.0018875838926174498,\n\
\ \"em_stderr\": 0.00044451099905589315,\n \"f1\": 0.06191904362416096,\n\
\ \"f1_stderr\": 0.0014055022875998687\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.53839590443686,\n \"acc_stderr\": 0.014568245550296354,\n\
\ \"acc_norm\": 0.5819112627986348,\n \"acc_norm_stderr\": 0.014413988396996077\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6399123680541725,\n\
\ \"acc_stderr\": 0.004790445139186366,\n \"acc_norm\": 0.8347938657637921,\n\
\ \"acc_norm_stderr\": 0.003706075184380282\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\
\ \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.7548387096774194,\n\
\ \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633507,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633507\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787575,\n \"\
acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787575\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728742,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728742\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.02336505149175372,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.02336505149175372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.789272030651341,\n\
\ \"acc_stderr\": 0.014583812465862541,\n \"acc_norm\": 0.789272030651341,\n\
\ \"acc_norm_stderr\": 0.014583812465862541\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977254,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3240223463687151,\n\
\ \"acc_stderr\": 0.015652542496421118,\n \"acc_norm\": 0.3240223463687151,\n\
\ \"acc_norm_stderr\": 0.015652542496421118\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937613,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937613\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902168,\n\
\ \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902168\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4028683181225554,\n\
\ \"acc_stderr\": 0.012526955577118016,\n \"acc_norm\": 0.4028683181225554,\n\
\ \"acc_norm_stderr\": 0.012526955577118016\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6323529411764706,\n \"acc_stderr\": 0.019506291693954854,\n \
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.019506291693954854\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.03055531675557364,\n\
\ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.03055531675557364\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n\
\ \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n\
\ \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.43563008850906,\n\
\ \"mc2_stderr\": 0.014459760341061523\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.011947592365207394\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0018875838926174498,\n \
\ \"em_stderr\": 0.00044451099905589315,\n \"f1\": 0.06191904362416096,\n\
\ \"f1_stderr\": 0.0014055022875998687\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.14935557240333586,\n \"acc_stderr\": 0.009818090723727293\n\
\ }\n}\n```"
repo_url: https://huggingface.co/PocketDoc/Dans-AdventurousWinds-Mk2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|drop|3_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-52-43.892204.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-52-43.892204.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- '**/details_harness|winogrande|5_2023-11-13T15-52-43.892204.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T15-52-43.892204.parquet'
- config_name: results
data_files:
- split: 2023_11_13T15_52_43.892204
path:
- results_2023-11-13T15-52-43.892204.parquet
- split: latest
path:
- results_2023-11-13T15-52-43.892204.parquet
---
# Dataset Card for Evaluation run of PocketDoc/Dans-AdventurousWinds-Mk2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PocketDoc/Dans-AdventurousWinds-Mk2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PocketDoc/Dans-AdventurousWinds-Mk2-7b](https://huggingface.co/PocketDoc/Dans-AdventurousWinds-Mk2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-Mk2-7b_public",
"harness_winogrande_5",
split="train")
```
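
The per-task configurations listed in the YAML header follow the same pattern, and the aggregated metrics live in the `results` configuration. A minimal sketch of both (the config and split names below are taken from the file list above, not verified against the live repo):

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-Mk2-7b_public"

# Per-sample details for one MMLU sub-task; the "latest" split mirrors
# the most recent timestamped split, per the config list above.
details = load_dataset(REPO, "harness_hendrycksTest_world_religions_5", split="latest")

# Aggregated metrics for the whole run.
results = load_dataset(REPO, "results", split="latest")
print(results[0])
```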
## Latest results
These are the [latest results from run 2023-11-13T15:52:43.892204](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-Mk2-7b_public/blob/main/results_2023-11-13T15-52-43.892204.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6117457883588181,
"acc_stderr": 0.03285127869008788,
"acc_norm": 0.621056172344861,
"acc_norm_stderr": 0.033574977794886766,
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.43563008850906,
"mc2_stderr": 0.014459760341061523,
"em": 0.0018875838926174498,
"em_stderr": 0.00044451099905589315,
"f1": 0.06191904362416096,
"f1_stderr": 0.0014055022875998687
},
"harness|arc:challenge|25": {
"acc": 0.53839590443686,
"acc_stderr": 0.014568245550296354,
"acc_norm": 0.5819112627986348,
"acc_norm_stderr": 0.014413988396996077
},
"harness|hellaswag|10": {
"acc": 0.6399123680541725,
"acc_stderr": 0.004790445139186366,
"acc_norm": 0.8347938657637921,
"acc_norm_stderr": 0.003706075184380282
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895525,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633507,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633507
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787575,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787575
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.02336505149175372,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.02336505149175372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.789272030651341,
"acc_stderr": 0.014583812465862541,
"acc_norm": 0.789272030651341,
"acc_norm_stderr": 0.014583812465862541
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977254,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3240223463687151,
"acc_stderr": 0.015652542496421118,
"acc_norm": 0.3240223463687151,
"acc_norm_stderr": 0.015652542496421118
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937613,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937613
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.025842248700902168,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.025842248700902168
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4028683181225554,
"acc_stderr": 0.012526955577118016,
"acc_norm": 0.4028683181225554,
"acc_norm_stderr": 0.012526955577118016
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.019506291693954854,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.019506291693954854
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.03055531675557364,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.03055531675557364
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7910447761194029,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.7910447761194029,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.43563008850906,
"mc2_stderr": 0.014459760341061523
},
"harness|winogrande|5": {
"acc": 0.7632202052091555,
"acc_stderr": 0.011947592365207394
},
"harness|drop|3": {
"em": 0.0018875838926174498,
"em_stderr": 0.00044451099905589315,
"f1": 0.06191904362416096,
"f1_stderr": 0.0014055022875998687
},
"harness|gsm8k|5": {
"acc": 0.14935557240333586,
"acc_stderr": 0.009818090723727293
}
}
```
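
To work with these numbers programmatically, the raw results file named in the configuration above can be fetched directly. A hedged sketch (it assumes the metrics either sit at the file's top level, as printed above, or are nested under a `"results"` key; neither layout is verified against the live repo):

```python
import json

from huggingface_hub import hf_hub_download

# Download the results JSON referenced in this card.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-Mk2-7b_public",
    filename="results_2023-11-13T15-52-43.892204.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Fall back to the top level if the metrics are not nested (assumption).
metrics = data.get("results", data)

# Rank the MMLU (hendrycksTest) sub-tasks by normalized accuracy.
mmlu = {
    task: m["acc_norm"]
    for task, m in metrics.items()
    if task.startswith("harness|hendrycksTest") and "acc_norm" in m
}
for task, acc_norm in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{acc_norm:.3f}  {task}")
```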
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7203882932662964,
-0.8665869235992432,
0.3272631764411926,
0.2003854364156723,
-0.20171645283699036,
-0.04977787658572197,
-0.014599982649087906,
-0.1907990723848343,
0.558907151222229,
-0.009098903276026249,
-0.4964723587036133,
-0.7073187232017517,
-0.4489246606826782,
0.2318182438611... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Weizong/Wei | Weizong | 2023-11-13T16:00:00Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T16:00:00Z | 2023-11-13T16:00:00.000Z | 2023-11-13T16:00:00 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_w95__megachat_public | open-llm-leaderboard | 2023-11-13T16:02:37Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T16:02:37Z | 2023-11-13T16:01:50.000Z | 2023-11-13T16:01:50 | ---
pretty_name: Evaluation run of w95/megachat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [w95/megachat](https://huggingface.co/w95/megachat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_w95__megachat_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T15:59:20.049368](https://huggingface.co/datasets/open-llm-leaderboard/details_w95__megachat_public/blob/main/results_2023-11-13T15-59-20.049368.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each task in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25936163462487405,\n\
\ \"acc_stderr\": 0.03091692313677521,\n \"acc_norm\": 0.26122603283331186,\n\
\ \"acc_norm_stderr\": 0.031692702511721224,\n \"mc1\": 0.24479804161566707,\n\
\ \"mc1_stderr\": 0.015051869486715014,\n \"mc2\": 0.39854544628414945,\n\
\ \"mc2_stderr\": 0.014106781910887378,\n \"em\": 0.0006291946308724832,\n\
\ \"em_stderr\": 0.00025680027497239604,\n \"f1\": 0.041603397651006783,\n\
\ \"f1_stderr\": 0.0011146754682383132\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.27047781569965873,\n \"acc_stderr\": 0.012980954547659554,\n\
\ \"acc_norm\": 0.30802047781569963,\n \"acc_norm_stderr\": 0.01349142951729204\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4100776737701653,\n\
\ \"acc_stderr\": 0.004908423147162023,\n \"acc_norm\": 0.5435172276438957,\n\
\ \"acc_norm_stderr\": 0.004970846697552307\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.03761070869867479,\n\
\ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.03761070869867479\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.02512576648482784,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.02512576648482784\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n\
\ \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n\
\ \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.25161290322580643,\n\
\ \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.031089826002937523,\n\
\ \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.031089826002937523\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20202020202020202,\n \"acc_stderr\": 0.02860620428922988,\n \"\
acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.02860620428922988\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565319,\n\
\ \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565319\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2230769230769231,\n \"acc_stderr\": 0.021107730127243995,\n\
\ \"acc_norm\": 0.2230769230769231,\n \"acc_norm_stderr\": 0.021107730127243995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.181651376146789,\n\
\ \"acc_stderr\": 0.01653061740926686,\n \"acc_norm\": 0.181651376146789,\n\
\ \"acc_norm_stderr\": 0.01653061740926686\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.02792096314799366,\n\
\ \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.02792096314799366\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460302,\n \
\ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460302\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.20179372197309417,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.36363636363636365,\n \"acc_stderr\": 0.04391326286724071,\n \"\
acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.04391326286724071\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25287356321839083,\n\
\ \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.25287356321839083,\n\
\ \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.31790123456790126,\n \"acc_stderr\": 0.025910063528240875,\n\
\ \"acc_norm\": 0.31790123456790126,\n \"acc_norm_stderr\": 0.025910063528240875\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.02635806569888059,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.02635806569888059\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27183833116036504,\n\
\ \"acc_stderr\": 0.011363135278651411,\n \"acc_norm\": 0.27183833116036504,\n\
\ \"acc_norm_stderr\": 0.011363135278651411\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.17279411764705882,\n \"acc_stderr\": 0.022966067585581756,\n\
\ \"acc_norm\": 0.17279411764705882,\n \"acc_norm_stderr\": 0.022966067585581756\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2696078431372549,\n \"acc_stderr\": 0.017952449196987866,\n \
\ \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.017952449196987866\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.040139645540727735,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.040139645540727735\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.027682979522960227,\n\
\ \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.027682979522960227\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064536,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3742690058479532,\n \"acc_stderr\": 0.037116011853894806,\n\
\ \"acc_norm\": 0.3742690058479532,\n \"acc_norm_stderr\": 0.037116011853894806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24479804161566707,\n\
\ \"mc1_stderr\": 0.015051869486715014,\n \"mc2\": 0.39854544628414945,\n\
\ \"mc2_stderr\": 0.014106781910887378\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5698500394632992,\n \"acc_stderr\": 0.013914685094716698\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0006291946308724832,\n \
\ \"em_stderr\": 0.00025680027497239604,\n \"f1\": 0.041603397651006783,\n\
\ \"f1_stderr\": 0.0011146754682383132\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.009855951478392721,\n \"acc_stderr\": 0.0027210765770416586\n\
\ }\n}\n```"
repo_url: https://huggingface.co/w95/megachat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|drop|3_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-59-20.049368.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-59-20.049368.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- '**/details_harness|winogrande|5_2023-11-13T15-59-20.049368.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T15-59-20.049368.parquet'
- config_name: results
data_files:
- split: 2023_11_13T15_59_20.049368
path:
- results_2023-11-13T15-59-20.049368.parquet
- split: latest
path:
- results_2023-11-13T15-59-20.049368.parquet
---
# Dataset Card for Evaluation run of w95/megachat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/w95/megachat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [w95/megachat](https://huggingface.co/w95/megachat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_w95__megachat_public",
"harness_winogrande_5",
split="train")
```
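The aggregated metrics can be loaded the same way by pointing at the "results" configuration listed above; a minimal sketch using the splits shown in the config listing:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# the "latest" split always points to the most recent results file.
results = load_dataset("open-llm-leaderboard/details_w95__megachat_public",
                       "results",
                       split="latest")
print(results[0])
```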
## Latest results
These are the [latest results from run 2023-11-13T15:59:20.049368](https://huggingface.co/datasets/open-llm-leaderboard/details_w95__megachat_public/blob/main/results_2023-11-13T15-59-20.049368.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.25936163462487405,
"acc_stderr": 0.03091692313677521,
"acc_norm": 0.26122603283331186,
"acc_norm_stderr": 0.031692702511721224,
"mc1": 0.24479804161566707,
"mc1_stderr": 0.015051869486715014,
"mc2": 0.39854544628414945,
"mc2_stderr": 0.014106781910887378,
"em": 0.0006291946308724832,
"em_stderr": 0.00025680027497239604,
"f1": 0.041603397651006783,
"f1_stderr": 0.0011146754682383132
},
"harness|arc:challenge|25": {
"acc": 0.27047781569965873,
"acc_stderr": 0.012980954547659554,
"acc_norm": 0.30802047781569963,
"acc_norm_stderr": 0.01349142951729204
},
"harness|hellaswag|10": {
"acc": 0.4100776737701653,
"acc_stderr": 0.004908423147162023,
"acc_norm": 0.5435172276438957,
"acc_norm_stderr": 0.004970846697552307
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.02512576648482784,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.02512576648482784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.031089826002937523,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.031089826002937523
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.02860620428922988,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.02860620428922988
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565319,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565319
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2230769230769231,
"acc_stderr": 0.021107730127243995,
"acc_norm": 0.2230769230769231,
"acc_norm_stderr": 0.021107730127243995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.181651376146789,
"acc_stderr": 0.01653061740926686,
"acc_norm": 0.181651376146789,
"acc_norm_stderr": 0.01653061740926686
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.02792096314799366,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.02792096314799366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604243,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604243
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.20179372197309417,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.20179372197309417,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.04391326286724071,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.04391326286724071
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.25287356321839083,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.25287356321839083,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.31790123456790126,
"acc_stderr": 0.025910063528240875,
"acc_norm": 0.31790123456790126,
"acc_norm_stderr": 0.025910063528240875
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.02635806569888059,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.02635806569888059
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27183833116036504,
"acc_stderr": 0.011363135278651411,
"acc_norm": 0.27183833116036504,
"acc_norm_stderr": 0.011363135278651411
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.17279411764705882,
"acc_stderr": 0.022966067585581756,
"acc_norm": 0.17279411764705882,
"acc_norm_stderr": 0.022966067585581756
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.017952449196987866,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.017952449196987866
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.040139645540727735,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.040139645540727735
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.027682979522960227,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.027682979522960227
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064536,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3742690058479532,
"acc_stderr": 0.037116011853894806,
"acc_norm": 0.3742690058479532,
"acc_norm_stderr": 0.037116011853894806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24479804161566707,
"mc1_stderr": 0.015051869486715014,
"mc2": 0.39854544628414945,
"mc2_stderr": 0.014106781910887378
},
"harness|winogrande|5": {
"acc": 0.5698500394632992,
"acc_stderr": 0.013914685094716698
},
"harness|drop|3": {
"em": 0.0006291946308724832,
"em_stderr": 0.00025680027497239604,
"f1": 0.041603397651006783,
"f1_stderr": 0.0011146754682383132
},
"harness|gsm8k|5": {
"acc": 0.009855951478392721,
"acc_stderr": 0.0027210765770416586
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7329713702201843,
-0.854579746723175,
0.27635911107063293,
0.2113243192434311,
-0.16077928245067596,
-0.0452704094350338,
0.011303485371172428,
-0.24038369953632355,
0.6214526891708374,
-0.07961566746234894,
-0.5229402184486389,
-0.6636618375778198,
-0.4214780330657959,
0.23812237381935... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
JVictor-CC/combined_dataset | JVictor-CC | 2023-11-13T16:10:41Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T16:10:41Z | 2023-11-13T16:10:41.000Z | 2023-11-13T16:10:41 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Epharedam/xiaocsv | Epharedam | 2023-11-13T16:34:07Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T16:34:07Z | 2023-11-13T16:32:19.000Z | 2023-11-13T16:32:19 | This CSV contains voice lines up to Between Facades and Familiar Faces
Duration: January 19, 2023, 10:00:00 AM – February 06, 2023, 03:59:59 AM | [
-0.23122678697109222,
-0.6374703645706177,
0.6688750982284546,
0.8159323930740356,
0.05754617229104042,
0.04748869687318802,
0.011990291997790337,
-0.6625893115997314,
0.4459013342857361,
0.779639482498169,
-1.1097452640533447,
-0.2505073547363281,
-0.14481709897518158,
-0.1424550712108612... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
lacrisun/IndonesianFoods | lacrisun | 2023-11-13T16:36:13Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-13T16:36:13Z | 2023-11-13T16:35:00.000Z | 2023-11-13T16:35:00 | ---
license: apache-2.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
chemNLP/eur-lex-europa-merged | chemNLP | 2023-11-13T16:37:00Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T16:37:00Z | 2023-11-13T16:36:58.000Z | 2023-11-13T16:36:58 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2027022
num_examples: 75
download_size: 1035116
dataset_size: 2027022
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "eur-lex-europa-merged"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.7621831893920898,
-0.11954602599143982,
0.39668530225753784,
-0.06546289473772049,
-0.4044337272644043,
0.25767752528190613,
0.048333048820495605,
-0.3125816583633423,
0.8525452613830566,
0.4911116361618042,
-0.8880625367164612,
-0.898681640625,
-0.5208726525306702,
-0.17615851759910583... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
aneeshas/DNN-vuln-1timestep | aneeshas | 2023-11-13T16:58:10Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T16:58:10Z | 2023-11-13T16:58:10.000Z | 2023-11-13T16:58:10 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
matheusrdgsf/re_dial_ptbr | matheusrdgsf | 2023-11-13T18:14:31Z | 0 | 0 | null | [
"task_categories:text-classification",
"task_categories:text2text-generation",
"task_categories:conversational",
"task_categories:translation",
"size_categories:10K<n<100K",
"language:pt",
"language:en",
"license:mit",
"conversational recommendation",
"recommendation",
"conversational",
"regio... | 2023-11-13T18:14:31Z | 2023-11-13T17:20:04.000Z | 2023-11-13T17:20:04 | ---
dataset_info:
features:
- name: conversationId
dtype: int32
- name: messages
list:
- name: messageId
dtype: int64
- name: senderWorkerId
dtype: int64
- name: text
dtype: string
- name: timeOffset
dtype: int64
- name: messages_translated
list:
- name: messageId
dtype: int64
- name: senderWorkerId
dtype: int64
- name: text
dtype: string
- name: timeOffset
dtype: int64
- name: movieMentions
list:
- name: movieId
dtype: string
- name: movieName
dtype: string
- name: respondentQuestions
list:
- name: liked
dtype: int64
- name: movieId
dtype: string
- name: seen
dtype: int64
- name: suggested
dtype: int64
- name: respondentWorkerId
dtype: int32
- name: initiatorWorkerId
dtype: int32
- name: initiatorQuestions
list:
- name: liked
dtype: int64
- name: movieId
dtype: string
- name: seen
dtype: int64
- name: suggested
dtype: int64
splits:
- name: train
num_bytes: 26389658
num_examples: 9005
- name: test
num_bytes: 3755474
num_examples: 1342
download_size: 11072939
dataset_size: 30145132
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
license: mit
task_categories:
- text-classification
- text2text-generation
- conversational
- translation
language:
- pt
- en
tags:
- conversational recommendation
- recommendation
- conversational
pretty_name: ReDial (Recommendation Dialogues) PTBR
size_categories:
- 10K<n<100K
---
# Dataset Card for ReDial - PTBR
- **Original dataset:** [Redial Huggingface](https://huggingface.co/datasets/re_dial)
- **Homepage:** [ReDial Dataset](https://redialdata.github.io/website/)
- **Repository:** [ReDialData](https://github.com/ReDialData/website/tree/data)
- **Paper:** [Towards Deep Conversational Recommendations](https://proceedings.neurips.cc/paper/2018/file/800de15c79c8d840f4e78d3af937d4d4-Paper.pdf)
### Dataset Summary
The ReDial (Recommendation Dialogues) PTBR dataset is an annotated collection of dialogues in which users recommend movies to each other, translated into Brazilian Portuguese.
The Brazilian Portuguese adaptation of this dataset was translated with [Maritalk](https://www.maritaca.ai/). This translated version opens up opportunities for research at the intersection of goal-directed dialogue systems (such as restaurant recommendations) and free-form, colloquial dialogue systems.
Some samples from the original dataset have been removed because we reached the usage limit in Maritalk. Consequently, the training set has been reduced by nearly 10%.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English and Portuguese.
## Dataset Structure
### Data Instances
```
{
"conversationId": 391,
"messages": [
{
"messageId": 1021,
"senderWorkerId": 0,
"text": "Hi there, how are you? I\'m looking for movie recommendations",
"timeOffset": 0
},
{
"messageId": 1022,
"senderWorkerId": 1,
"text": "I am doing okay. What kind of movies do you like?",
"timeOffset": 15
},
{
"messageId": 1023,
"senderWorkerId": 0,
"text": "I like animations like @84779 and @191602",
"timeOffset": 66
},
{
"messageId": 1024,
"senderWorkerId": 0,
"text": "I also enjoy @122159",
"timeOffset": 86
},
{
"messageId": 1025,
"senderWorkerId": 0,
"text": "Anything artistic",
"timeOffset": 95
},
{
"messageId": 1026,
"senderWorkerId": 1,
"text": "You might like @165710 that was a good movie.",
"timeOffset": 135
},
{
"messageId": 1027,
"senderWorkerId": 0,
"text": "What\'s it about?",
"timeOffset": 151
},
{
"messageId": 1028,
"senderWorkerId": 1,
"text": "It has Alec Baldwin it is about a baby that works for a company and gets adopted it is very funny",
"timeOffset": 207
},
{
"messageId": 1029,
"senderWorkerId": 0,
"text": "That seems like a nice comedy",
"timeOffset": 238
},
{
"messageId": 1030,
"senderWorkerId": 0,
"text": "Do you have any animated recommendations that are a bit more dramatic? Like @151313 for example",
"timeOffset": 272
},
{
"messageId": 1031,
"senderWorkerId": 0,
"text": "I like comedies but I prefer films with a little more depth",
"timeOffset": 327
},
{
"messageId": 1032,
"senderWorkerId": 1,
"text": "That is a tough one but I will remember something",
"timeOffset": 467
},
{
"messageId": 1033,
"senderWorkerId": 1,
"text": "@203371 was a good one",
"timeOffset": 509
},
{
"messageId": 1034,
"senderWorkerId": 0,
"text": "Ooh that seems cool! Thanks for the input. I\'m ready to submit if you are.",
"timeOffset": 564
},
{
"messageId": 1035,
"senderWorkerId": 1,
"text": "It is animated, sci fi, and has action",
"timeOffset": 571
},
{
"messageId": 1036,
"senderWorkerId": 1,
"text": "Glad I could help",
"timeOffset": 579
},
{
"messageId": 1037,
"senderWorkerId": 0,
"text": "Nice",
"timeOffset": 581
},
{
"messageId": 1038,
"senderWorkerId": 0,
"text": "Take care, cheers!",
"timeOffset": 591
},
{
"messageId": 1039,
"senderWorkerId": 1,
"text": "bye",
"timeOffset": 608
}
],
"messages_translated": [
{
"messageId": 1021,
"senderWorkerId": 0,
"text": "Olá, como você está? Estou procurando recomendações de filmes.",
"timeOffset": 0
},
{
"messageId": 1022,
"senderWorkerId": 1,
"text": "Eu estou indo bem. Qual tipo de filmes você gosta?",
"timeOffset": 15
},
{
"messageId": 1023,
"senderWorkerId": 0,
"text": "Eu gosto de animações como @84779 e @191602.",
"timeOffset": 66
},
{
"messageId": 1024,
"senderWorkerId": 0,
"text": "Eu também gosto de @122159.",
"timeOffset": 86
},
{
"messageId": 1025,
"senderWorkerId": 0,
"text": "Qualquer coisa artística",
"timeOffset": 95
},
{
"messageId": 1026,
"senderWorkerId": 1,
"text": "Você pode gostar de saber que foi um bom filme.",
"timeOffset": 135
},
{
"messageId": 1027,
"senderWorkerId": 0,
"text": "O que é isso?",
"timeOffset": 151
},
{
"messageId": 1028,
"senderWorkerId": 1,
"text": "Tem um bebê que trabalha para uma empresa e é adotado. É muito engraçado.",
"timeOffset": 207
},
{
"messageId": 1029,
"senderWorkerId": 0,
"text": "Isso parece ser uma comédia legal.",
"timeOffset": 238
},
{
"messageId": 1030,
"senderWorkerId": 0,
"text": "Você tem alguma recomendação animada que seja um pouco mais dramática, como por exemplo @151313?",
"timeOffset": 272
},
{
"messageId": 1031,
"senderWorkerId": 0,
"text": "Eu gosto de comédias, mas prefiro filmes com um pouco mais de profundidade.",
"timeOffset": 327
},
{
"messageId": 1032,
"senderWorkerId": 1,
"text": "Isso é um desafio, mas eu me lembrarei de algo.",
"timeOffset": 467
},
{
"messageId": 1033,
"senderWorkerId": 1,
"text": "@203371 Foi um bom dia.",
"timeOffset": 509
},
{
"messageId": 1034,
"senderWorkerId": 0,
"text": "Ah, parece legal! Obrigado pela contribuição. Estou pronto para enviar se você estiver.",
"timeOffset": 564
},
{
"messageId": 1035,
"senderWorkerId": 1,
"text": "É animado, de ficção científica e tem ação.",
"timeOffset": 571
},
{
"messageId": 1036,
"senderWorkerId": 1,
"text": "Fico feliz em poder ajudar.",
"timeOffset": 579
},
{
"messageId": 1037,
"senderWorkerId": 0,
"text": "Legal",
"timeOffset": 581
},
{
"messageId": 1038,
"senderWorkerId": 0,
"text": "Cuide-se, abraços!",
"timeOffset": 591
},
{
"messageId": 1039,
"senderWorkerId": 1,
"text": "Adeus",
"timeOffset": 608
}
],
"movieMentions": [
{
"movieId": "203371",
"movieName": "Final Fantasy: The Spirits Within (2001)"
},
{
"movieId": "84779",
"movieName": "The Triplets of Belleville (2003)"
},
{
"movieId": "122159",
"movieName": "Mary and Max (2009)"
},
{
"movieId": "151313",
"movieName": "A Scanner Darkly (2006)"
},
{
"movieId": "191602",
"movieName": "Waking Life (2001)"
},
{
"movieId": "165710",
"movieName": "The Boss Baby (2017)"
}
],
"respondentQuestions": [
{
"liked": 1,
"movieId": "203371",
"seen": 0,
"suggested": 1
},
{
"liked": 1,
"movieId": "84779",
"seen": 1,
"suggested": 0
},
{
"liked": 1,
"movieId": "122159",
"seen": 1,
"suggested": 0
},
{
"liked": 1,
"movieId": "151313",
"seen": 1,
"suggested": 0
},
{
"liked": 1,
"movieId": "191602",
"seen": 1,
"suggested": 0
},
{
"liked": 1,
"movieId": "165710",
"seen": 0,
"suggested": 1
}
],
"respondentWorkerId": 1,
"initiatorWorkerId": 0,
"initiatorQuestions": [
{
"liked": 1,
"movieId": "203371",
"seen": 0,
"suggested": 1
},
{
"liked": 1,
"movieId": "84779",
"seen": 1,
"suggested": 0
},
{
"liked": 1,
"movieId": "122159",
"seen": 1,
"suggested": 0
},
{
"liked": 1,
"movieId": "151313",
"seen": 1,
"suggested": 0
},
{
"liked": 1,
"movieId": "191602",
"seen": 1,
"suggested": 0
},
{
"liked": 1,
"movieId": "165710",
"seen": 0,
"suggested": 1
}
]
}
```
### Data Fields
The dataset is published in the “jsonl” format, i.e., as a text file where each line corresponds to a Dialogue given as a valid JSON document.
A Dialogue contains these fields:
**conversationId:** an integer
**initiatorWorkerId:** an integer identifying the worker initiating the conversation (the recommendation seeker)
**respondentWorkerId:** an integer identifying the worker responding to the initiator (the recommender)
**messages:** a list of Message objects
**messages_translated:** a list of Message objects
**movieMentions:** a list of objects pairing each movie ID mentioned in this dialogue with its movie name
**initiatorQuestions:** a list of objects with the labels supplied by the initiator for each movie ID. Each label is an integer indicating whether the seeker said they have seen the movie, whether they liked it, and whether it was suggested (see the label meanings below).
**respondentQuestions:** a list of objects with the labels supplied by the respondent for each movie ID, with the same label meanings.
Each Message of **messages** contains these fields:
**messageId:** a unique ID for this message
**text:** a string with the actual message. The string may contain a token starting with @ followed by an integer. This is a movie ID which can be looked up in the movieMentions field of the Dialogue object.
**timeOffset:** time since start of dialogue in seconds
**senderWorkerId:** the ID of the worker sending the message, either initiatorWorkerId or respondentWorkerId.
Each Message of **messages_translated** contains the same structure, with the text translated into Portuguese.
The labels in initiatorQuestions and respondentQuestions have the following meaning:
*suggested:* 0 if it was mentioned by the seeker, 1 if it was a suggestion from the recommender
*seen:* 0 if the seeker has not seen the movie, 1 if they have seen it, 2 if they did not say
*liked:* 0 if the seeker did not like the movie, 1 if they liked it, 2 if they did not say
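As a quick illustration of how these fields fit together, here is a minimal sketch (assuming the default configuration of this repository) that loads one dialogue and replaces the `@movieId` tokens in the Portuguese messages with the movie names from `movieMentions`:
```python
import re
from datasets import load_dataset

dialogues = load_dataset("matheusrdgsf/re_dial_ptbr", split="train")
dialogue = dialogues[0]

# Build an id -> name lookup from the movieMentions field.
id_to_name = {m["movieId"]: m["movieName"] for m in dialogue["movieMentions"]}

# Replace each "@<movieId>" token with the corresponding movie name,
# leaving the token untouched if the ID is not listed.
def resolve(text):
    return re.sub(r"@(\d+)", lambda m: id_to_name.get(m.group(1), m.group(0)), text)

for msg in dialogue["messages_translated"]:
    print(msg["senderWorkerId"], resolve(msg["text"]))
```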
### Data Splits
The original dataset contains a total of 11348 dialogues, 10006 for training and model selection, and 1342 for testing.
This translated version keeps similar counts, except that the train split is reduced by nearly 10%.
### Contributions
This work has been done by [matheusrdg](https://github.com/matheusrdg) and [wfco](https://github.com/willianfco).
The translation of this dataset was made possible thanks to the Maritalk API.
| [
-0.374540239572525,
-0.5263271927833557,
0.23420198261737823,
0.44134703278541565,
-0.5740745663642883,
0.1976139396429062,
0.07068448513746262,
-0.17412522435188293,
0.9599647521972656,
0.34013983607292175,
-0.7890980839729309,
-0.7138379812240601,
-0.6959463953971863,
0.02069402486085891... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
hugosousa/ProfessorHeidelTime | hugosousa | 2023-11-13T17:43:54Z | 0 | 0 | null | [
"task_categories:token-classification",
"task_ids:parsing",
"task_ids:part-of-speech",
"task_ids:named-entity-recognition",
"annotations_creators:machine-generated",
"language_creators:found",
"multilinguality:multilingual",
"size_categories:100K<n<1M",
"source_datasets:original",
"language:en",
... | 2023-11-13T17:43:54Z | 2023-11-13T17:31:13.000Z | 2023-11-13T17:31:13 | ---
annotations_creators:
- machine-generated
language:
- en
- fr
- pt
- de
- it
- es
language_creators:
- found
license:
- mit
multilinguality:
- multilingual
pretty_name: Professor HeidelTime
size_categories:
- 100K<n<1M
source_datasets:
- original
tags:
- Timex
- Timexs
- Temporal Expression
- Temporal Expressions
- Temporal Information
- Timex Identification
- Timex Classification
- Timex Extraction
task_categories:
- token-classification
task_ids:
- parsing
- part-of-speech
- named-entity-recognition
configs:
- config_name: portuguese
data_files: "portuguese.json"
- config_name: english
data_files: "english.json"
- config_name: french
data_files: "french.json"
- config_name: italian
data_files: "italian.json"
- config_name: spanish
data_files: "spanish.json"
- config_name: german
data_files: "german.json"
---
# Professor HeidelTime
[Paper](https://dl.acm.org/doi/10.1145/3583780.3615130) [GitHub](https://github.com/hmosousa/professor_heideltime)
Professor HeidelTime is a project to create a multilingual corpus weakly labeled with [HeidelTime](https://github.com/HeidelTime/heideltime), a temporal tagger.
## Corpus Details
The weak labeling was performed in six languages. Here are the specifics of the corpus for each language:
| Dataset | Language | Documents | From | To | Tokens | Timexs |
| ----------------------- | -------- | --------- | ---------- | ---------- | ---------- | -------- |
| All the News 2.0 | EN | 24,642 | 2016-01-01 | 2020-04-02 | 18,755,616 | 254,803 |
| Italian Crime News | IT | 9,619 | 2011-01-01 | 2021-12-31 | 3,296,898 | 58,823 |
| German News Dataset | DE | 33,266 | 2003-01-01 | 2022-12-31 | 21,617,888 | 348,011 |
| ElMundo News | ES | 19,095 | 2005-12-02 | 2021-10-18 | 12,515,410 | 194,043 |
| French Financial News | FR | 24,293 | 2017-10-19 | 2021-03-19 | 1,673,053 | 83,431 |
| Público News | PT | 27,154 | 2000-11-14 | 2002-03-20 | 5,929,377 | 111,810 |
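Each language is exposed as a separate configuration (see the `configs` block in the metadata above), so a single language can be loaded on its own; a minimal sketch, assuming the single JSON file per config is exposed as the default "train" split:
```python
from datasets import load_dataset

# One configuration per language:
# portuguese, english, french, italian, spanish, german.
english = load_dataset("hugosousa/ProfessorHeidelTime", "english", split="train")
print(english[0])
```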
## Contact
For more information, reach out to [Hugo Sousa](https://hugosousa.net) at <hugo.o.sousa@inesctec.pt>.
This framework is a part of the [Text2Story](https://text2story.inesctec.pt) project. This project is financed by the ERDF – European Regional Development Fund through the North Portugal Regional Operational Programme (NORTE 2020), under the PORTUGAL 2020 and by National Funds through the Portuguese funding agency, FCT - Fundação para a Ciência e a Tecnologia within project PTDC/CCI-COM/31857/2017 (NORTE-01-0145-FEDER-03185).
## Cite
If you use this work, please cite the following [paper](https://dl.acm.org/doi/10.1145/3583780.3615130):
```bibtex
@inproceedings{10.1145/3583780.3615130,
author = {Sousa, Hugo and Campos, Ricardo and Jorge, Al\'{\i}pio},
title = {TEI2GO: A Multilingual Approach for Fast Temporal Expression Identification},
year = {2023},
isbn = {9798400701245},
publisher = {Association for Computing Machinery},
url = {https://doi.org/10.1145/3583780.3615130},
doi = {10.1145/3583780.3615130},
booktitle = {Proceedings of the 32nd ACM International Conference on Information and Knowledge Management},
pages = {5401–5406},
numpages = {6},
keywords = {temporal expression identification, multilingual corpus, weak label},
location = {Birmingham, United Kingdom},
series = {CIKM '23}
}
```
| [
-0.4363786578178406,
-0.46584418416023254,
0.42467665672302246,
0.2732781171798706,
-0.24884441494941711,
0.07264570146799088,
-0.40659162402153015,
-0.7688749432563782,
0.26500147581100464,
0.005103608127683401,
-0.268926739692688,
-0.8204230666160583,
-0.7081337571144104,
0.0613479018211... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
cc-platform-links/10_RANDOM_WAT_FILTERED_LINKS | cc-platform-links | 2023-11-13T17:51:54Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T17:51:54Z | 2023-11-13T17:51:47.000Z | 2023-11-13T17:51:47 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: url
dtype: string
splits:
- name: train
num_bytes: 1505397.4479696343
num_examples: 19940
download_size: 474964
dataset_size: 1505397.4479696343
---
# Dataset Card for "10_RANDOM_WAT_FILTERED_LINKS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6881927847862244,
-0.4028848111629486,
0.4692244529724121,
0.13034909963607788,
-0.40874719619750977,
-0.3306690454483032,
0.17621171474456787,
-0.24109435081481934,
0.8308917284011841,
0.5839653015136719,
-0.998060405254364,
-0.7888685464859009,
-0.5211673974990845,
0.02451252937316894... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
dpquoc/wiki-faiss-tmp | dpquoc | 2023-11-13T18:18:50Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T18:18:50Z | 2023-11-13T18:11:24.000Z | 2023-11-13T18:11:24 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
malucoelhaofc/ScottTenormanV2 | malucoelhaofc | 2023-11-16T12:24:40Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-16T12:24:40Z | 2023-11-13T18:12:10.000Z | 2023-11-13T18:12:10 | ---
license: openrail
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
hippocrates/MedicalTranscriptions_test | hippocrates | 2023-11-13T18:51:25Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T18:51:25Z | 2023-11-13T18:51:22.000Z | 2023-11-13T18:51:22 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: train
num_bytes: 24235832
num_examples: 4999
- name: valid
num_bytes: 2251838
num_examples: 500
- name: test
num_bytes: 5226991
num_examples: 999
download_size: 10111285
dataset_size: 31714661
---
# Dataset Card for "MedicalTranscriptions_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.32103848457336426,
-0.2557121515274048,
0.30447322130203247,
0.213673397898674,
-0.19997364282608032,
-0.013373302295804024,
0.2576560974121094,
-0.23290219902992249,
0.9656378030776978,
0.36609870195388794,
-1.0233798027038574,
-0.6761938333511353,
-0.7365994453430176,
-0.0733794718980... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Pablao0948/Steve_Pizza | Pablao0948 | 2023-11-13T19:26:30Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-13T19:26:30Z | 2023-11-13T19:26:10.000Z | 2023-11-13T19:26:10 | ---
license: openrail
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Suqqaro/mokcomedy-voice | Suqqaro | 2023-11-13T19:51:50Z | 0 | 0 | null | [
"license:unknown",
"region:us"
] | 2023-11-13T19:51:50Z | 2023-11-13T19:47:18.000Z | 2023-11-13T19:47:18 | ---
license: unknown
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
jiminHuang/clinical_ner_train | jiminHuang | 2023-11-13T19:50:24Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T19:50:24Z | 2023-11-13T19:50:16.000Z | 2023-11-13T19:50:16 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 8527346
num_examples: 10661
- name: valid
num_bytes: 4951299
num_examples: 6254
- name: test
num_bytes: 5307591
num_examples: 6806
download_size: 5455050
dataset_size: 18786236
---
# Dataset Card for "clinical_ner_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.3914356231689453,
-0.0938318744301796,
0.2528924345970154,
0.06740779429674149,
0.04897588491439819,
-0.19163772463798523,
0.26464614272117615,
-0.06304702162742615,
0.8568163514137268,
0.4684004485607147,
-0.7623777389526367,
-0.7072303295135498,
-0.7789530158042908,
-0.194201663136482... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_jondurbin__airoboros-m-7b-3.1.2_public | open-llm-leaderboard | 2023-11-13T19:55:52Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T19:55:52Z | 2023-11-13T19:55:07.000Z | 2023-11-13T19:55:07 | ---
pretty_name: Evaluation run of jondurbin/airoboros-m-7b-3.1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-m-7b-3.1.2](https://huggingface.co/jondurbin/airoboros-m-7b-3.1.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-m-7b-3.1.2_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T19:52:08.394828](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-m-7b-3.1.2_public/blob/main/results_2023-11-13T19-52-08.394828.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6135698931222068,\n\
\ \"acc_stderr\": 0.032663709384362964,\n \"acc_norm\": 0.6227233835131805,\n\
\ \"acc_norm_stderr\": 0.033389385625867025,\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5374874453696802,\n\
\ \"mc2_stderr\": 0.015091604419760369,\n \"em\": 0.352873322147651,\n\
\ \"em_stderr\": 0.004893771826676391,\n \"f1\": 0.41195889261745017,\n\
\ \"f1_stderr\": 0.004738382745022343\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449696,\n\
\ \"acc_norm\": 0.6186006825938567,\n \"acc_norm_stderr\": 0.014194389086685256\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6340370444134634,\n\
\ \"acc_stderr\": 0.0048071469251620555,\n \"acc_norm\": 0.8350926110336586,\n\
\ \"acc_norm_stderr\": 0.0037033852685121726\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.03899073687357335,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.03899073687357335\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396264,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396264\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.024537591572830503,\n\
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.024537591572830503\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683522,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683522\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010347,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010347\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407006,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407006\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265012,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265012\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937617,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937617\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294674,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294674\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4445893089960887,\n\
\ \"acc_stderr\": 0.012691575792657117,\n \"acc_norm\": 0.4445893089960887,\n\
\ \"acc_norm_stderr\": 0.012691575792657117\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6470588235294118,\n \"acc_stderr\": 0.01933314202079716,\n \
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.01933314202079716\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5374874453696802,\n\
\ \"mc2_stderr\": 0.015091604419760369\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774108\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.352873322147651,\n \
\ \"em_stderr\": 0.004893771826676391,\n \"f1\": 0.41195889261745017,\n \
\ \"f1_stderr\": 0.004738382745022343\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.13874147081122062,\n \"acc_stderr\": 0.009521649920798146\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-m-7b-3.1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|arc:challenge|25_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|drop|3_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|gsm8k|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hellaswag|10_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|winogrande|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T19-52-08.394828.parquet'
- config_name: results
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- results_2023-11-13T19-52-08.394828.parquet
- split: latest
path:
- results_2023-11-13T19-52-08.394828.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-m-7b-3.1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-m-7b-3.1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-m-7b-3.1.2](https://huggingface.co/jondurbin/airoboros-m-7b-3.1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-m-7b-3.1.2_public",
"harness_winogrande_5",
	split="latest")  # "latest" and the timestamped split are the only splits defined in the configs above
```
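
To enumerate all of the per-task configs instead of hard-coding one, the config names can be read programmatically. A minimal sketch using `datasets.get_dataset_config_names` (only the repo id is taken from this card; the printed count should match the 64 configurations mentioned above):

```python
from datasets import get_dataset_config_names

# Enumerate every config declared for this details repo
# (the per-task "harness_*" configs plus the aggregated "results" config).
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_jondurbin__airoboros-m-7b-3.1.2_public"
)
print(len(configs))
print(configs[:5])
```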
## Latest results
These are the [latest results from run 2023-11-13T19:52:08.394828](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-m-7b-3.1.2_public/blob/main/results_2023-11-13T19-52-08.394828.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6135698931222068,
"acc_stderr": 0.032663709384362964,
"acc_norm": 0.6227233835131805,
"acc_norm_stderr": 0.033389385625867025,
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5374874453696802,
"mc2_stderr": 0.015091604419760369,
"em": 0.352873322147651,
"em_stderr": 0.004893771826676391,
"f1": 0.41195889261745017,
"f1_stderr": 0.004738382745022343
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449696,
"acc_norm": 0.6186006825938567,
"acc_norm_stderr": 0.014194389086685256
},
"harness|hellaswag|10": {
"acc": 0.6340370444134634,
"acc_stderr": 0.0048071469251620555,
"acc_norm": 0.8350926110336586,
"acc_norm_stderr": 0.0037033852685121726
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03899073687357335,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03899073687357335
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396264,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396264
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.024537591572830503,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.024537591572830503
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683522,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683522
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010347,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010347
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407006,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407006
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265012,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265012
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937617,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294674,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294674
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4445893089960887,
"acc_stderr": 0.012691575792657117,
"acc_norm": 0.4445893089960887,
"acc_norm_stderr": 0.012691575792657117
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.01933314202079716,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.01933314202079716
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982066,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982066
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5374874453696802,
"mc2_stderr": 0.015091604419760369
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774108
},
"harness|drop|3": {
"em": 0.352873322147651,
"em_stderr": 0.004893771826676391,
"f1": 0.41195889261745017,
"f1_stderr": 0.004738382745022343
},
"harness|gsm8k|5": {
"acc": 0.13874147081122062,
"acc_stderr": 0.009521649920798146
}
}
```
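
The aggregated numbers shown above are also exposed through the "results" config declared in the metadata, so they can be read without parsing the JSON by hand. A minimal sketch, assuming the "latest" split of the "results" config holds the aggregated metrics as rows (the exact column layout is not documented here and may differ):

```python
from datasets import load_dataset

# "results" is the aggregated config listed in the configs section above;
# its "latest" split always points to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airoboros-m-7b-3.1.2_public",
    "results",
    split="latest",
)
print(results[0])  # assumption: a row carrying the aggregated metrics
```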
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7398000955581665,
-0.8215371370315552,
0.23771849274635315,
0.21225780248641968,
-0.18368686735630035,
-0.08218743652105331,
0.020512335002422333,
-0.22881628572940826,
0.5824134945869446,
-0.01127379760146141,
-0.4801618456840515,
-0.7065052390098572,
-0.4579450786113739,
0.23686762154... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Norod78/muppetshow-blip-captions | Norod78 | 2023-11-13T20:39:50Z | 0 | 2 | null | [
"region:us"
] | 2023-11-13T20:39:50Z | 2023-11-13T20:39:01.000Z | 2023-11-13T20:39:01 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 495380043.0
num_examples: 402
download_size: 495385822
dataset_size: 495380043.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "muppetshow-blip-captions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.4537140130996704,
-0.04368560016155243,
-0.005875071510672569,
0.5098778009414673,
-0.3775025010108948,
0.318280965089798,
-0.0016666267765685916,
-0.16033300757408142,
0.9865875244140625,
0.48696619272232056,
-0.7762053608894348,
-0.5179480910301208,
-0.6089905500411987,
-0.14981780946... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
fabiancelik/coachingllm | fabiancelik | 2023-11-19T17:38:14Z | 0 | 0 | null | [
"task_categories:question-answering",
"task_categories:text-generation",
"task_categories:conversational",
"size_categories:n<1K",
"language:en",
"license:apache-2.0",
"coaching",
"questions",
"region:us"
] | 2023-11-19T17:38:14Z | 2023-11-13T20:39:33.000Z | 2023-11-13T20:39:33 | ---
license: apache-2.0
task_categories:
- question-answering
- text-generation
- conversational
language:
- en
tags:
- coaching
- questions
size_categories:
- n<1K
---
# Dataset Card for coachingllm
<!-- Provide a quick summary of the dataset. -->
A collection of coaching questions.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** Fabian Celik
- **Language(s) (NLP):** en
- **License:** apache-2.0 | [
-0.605549156665802,
-0.6020761728286743,
-0.09082957357168198,
0.4159540832042694,
-0.44397687911987305,
0.07348216325044632,
-0.2775108814239502,
-0.37490689754486084,
0.3457145690917969,
0.5269051194190979,
-1.289382815361023,
-0.9514240622520447,
-0.6832124590873718,
0.12958306074142456... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Aspik101/train_thumbnails | Aspik101 | 2023-11-13T20:50:32Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T20:50:32Z | 2023-11-13T20:50:32.000Z | 2023-11-13T20:50:32 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Aspik101/train_thumbnails2 | Aspik101 | 2023-11-13T21:18:19Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T21:18:19Z | 2023-11-13T20:51:13.000Z | 2023-11-13T20:51:13 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 3646466821.0
num_examples: 513
download_size: 3646525869
dataset_size: 3646466821.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "train_thumbnails2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.4664071798324585,
-0.0682167038321495,
0.21737012267112732,
0.35182344913482666,
-0.4081069529056549,
-0.14375880360603333,
0.26151251792907715,
-0.04575556144118309,
0.7312042117118835,
0.35108625888824463,
-0.7688372135162354,
-0.4809204041957855,
-0.7141423225402832,
-0.4168236255645... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
DmitrMakeev/Deforum-file | DmitrMakeev | 2023-11-19T17:42:00Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-19T17:42:00Z | 2023-11-13T20:58:25.000Z | 2023-11-13T20:58:25 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
wrdias/midias | wrdias | 2023-11-13T21:14:45Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T21:14:45Z | 2023-11-13T21:13:44.000Z | 2023-11-13T21:13:44 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Aspik101/UBCOCEAN | Aspik101 | 2023-11-13T21:19:53Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T21:19:53Z | 2023-11-13T21:19:53.000Z | 2023-11-13T21:19:53 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Pablao0948/Locutor_SBT | Pablao0948 | 2023-11-13T22:54:53Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-13T22:54:53Z | 2023-11-13T22:54:26.000Z | 2023-11-13T22:54:26 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
kik41/Rolellm | kik41 | 2023-11-13T23:33:12Z | 0 | 0 | null | [
"region:us"
] | 2023-11-13T23:33:12Z | 2023-11-13T23:30:47.000Z | 2023-11-13T23:30:47 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
malucoelhaofc/ScottTenorman201V2 | malucoelhaofc | 2023-11-14T00:02:41Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-14T00:02:41Z | 2023-11-14T00:01:58.000Z | 2023-11-14T00:01:58 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
MetalMace/MTG-CardArt | MetalMace | 2023-11-14T01:05:34Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-14T01:05:34Z | 2023-11-14T01:05:34.000Z | 2023-11-14T01:05:34 | ---
license: mit
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AlphaNeil/AIart2 | AlphaNeil | 2023-11-14T01:18:15Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T01:18:15Z | 2023-11-14T01:18:15.000Z | 2023-11-14T01:18:15 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null |