id string | lastModified string | tags list | author string | description string | citation string | cardData null | likes int64 | downloads int64 | card string |
|---|---|---|---|---|---|---|---|---|---|
soda-lmu/tweet-annotation-experiments | 2023-09-14T17:18:22.000Z | [
"task_categories:text-classification",
"task_ids:sentiment-classification",
"task_ids:hate-speech-detection",
"size_categories:10K<n<100K",
"language:en",
"region:us"
] | soda-lmu | null | null | null | 0 | 0 | ---
task_categories:
- text-classification
language:
- en
task_ids:
- sentiment-classification
- hate-speech-detection
size_categories:
- 10K<n<100K
---
# Tweet Annotations in Five Experimental Conditions
## Description
The dataset contains annotations of tweets for **hate speech** (HS) and **offensive language** (OL) in five experimental conditions. The tweet data was sampled from the corpus created by [Davidson et al. (2017)](https://ojs.aaai.org/index.php/ICWSM/article/view/14955); we selected 3,000 tweets for annotation. We developed five experimental conditions that varied the structure of the annotation task, as shown in the figure below. All tweets were annotated in each condition.
- **<font color="#871F78">Condition A</font>** presented the tweet and three options on a single screen: hate speech, offensive language, or neither. Annotators could select hate speech, offensive language, or both, or indicate that neither applied.
- Conditions B and C split the annotation of a single tweet across two screens.
+ For **<font color="blue">Condition B</font>**, the first screen asked the annotator whether the tweet contained hate speech. On the following screen, they were shown the tweet again and asked whether it contained offensive language.
+ **<font color="red">Condition C</font>** was similar to Condition B, but reversed the order of the hate speech and offensive language questions for each tweet.
- In Conditions D and E, the two tasks were treated independently: annotators first annotated all tweets for one task, then annotated all tweets again for the second task.
+ Annotators assigned to **<font color="green">Condition D</font>** first annotated hate speech for all their assigned tweets, and then annotated offensive language for the same set of tweets.
+ **Condition E** worked the same way, but started with the offensive language annotation task followed by the hate speech annotation task.
We recruited US-based annotators from the crowdsourcing platform [Prolific](https://www.prolific.com/) during November and December 2022. Each annotator annotated up to 50 tweets. The dataset also contains demographic information about the annotators. Annotators received a fixed hourly wage in excess of the US federal minimum wage after completing the task.
<img src="https://datasets-server.huggingface.co/assets/soda-lmu/tweet_test/--/default/train/0/image/image.jpg" width="300" height="200" alt="Overview of the five experimental conditions" align="center" />
## Dataset Structure
| Column Name | Description | Type |
| -------------- | ------------------ |---------------- |
| case_id | case ID | integer |
| duration_seconds | duration of connection to task in seconds | integer |
| last_screen | last question answered | factor |
| device | device type | factor |
| ethn_hispanic | Hispanic race/ethnicity | binary |
| ethn_white | White race/ethnicity | binary |
| ethn_afr_american | African-American race/ethnicity | binary |
| ethn_asian | Asian race/ethnicity | binary |
| ethn_sth_else | race/ethnicity something else | binary |
| ethn_prefer_not | race/ethnicity prefer not to say | binary |
| age | age | integer |
| education | educational attainment <br>1: Less than high school <br>2: High school <br>3: Some college <br>4: College graduate <br>5: Master's degree or professional degree (law, medicine, MPH, etc.) <br>6: Doctoral degree (PhD, DPH, EdD, etc.)| factor |
| english_fl | English as first language | binary |
| twitter_use | Twitter use frequency <br>1: Most days <br>2: Most weeks, but not every day <br>3: A few times a month <br>4: A few times a year <br>5: Less often <br>6: Never | factor |
| socmedia_use | social media use frequency <br>1: Most days <br>2: Most weeks, but not every day <br>3: A few times a month <br>4: A few times a year <br>5: Less often <br>6: Never | factor |
| prolific_hours | hours worked on the Prolific platform in the last month | integer |
| task_fun | task perception: fun | binary |
| task_interesting | task perception: interesting | binary |
| task_boring | task perception: boring | binary |
| task_repetitive | task perception: repetitive | binary |
| task_important | task perception: important | binary |
| task_depressing | task perception: depressing | binary |
| task_offensive | task perception: offensive | binary |
| repeat_tweet_coding | likelihood for another tweet task <br>1: Not at all likely <br>2: Somewhat likely <br>3: Very likely | factor |
| repeat_hs_coding | likelihood for another hate speech task <br>1: Not at all likely <br>2: Somewhat likely <br>3: Very likely | factor |
| target_online_harassment | targeted by hateful online behavior | binary |
| target_other_harassment | targeted by other hateful behavior | binary |
| party_affiliation | party identification <br>1: Republican <br>2: Democrat <br>3: Independent | factor |
| societal_relevance_hs | relevance perception of hate speech <br>1: Not at all likely <br>2: Somewhat likely <br>3: Very likely | factor |
| annotator_id | annotator ID | integer |
| condition | experimental conditions (A-E) | factor |
| tweet_batch | tweet ID in batch | factor |
| hate_speech | hate speech annotation | logical |
| offensive_language | offensive language annotation | logical |
| tweet_id | tweet ID | integer |
| orig_label_hs | number of persons who annotated the tweet as hate speech in the original dataset from [Davidson et al. (2017)](https://ojs.aaai.org/index.php/ICWSM/article/view/14955) | integer |
| orig_label_ol | number of persons who annotated the tweet as offensive language in the original dataset from [Davidson et al. (2017)](https://ojs.aaai.org/index.php/ICWSM/article/view/14955) | integer |
| orig_label_ne | number of persons who annotated the tweet as neither in the original dataset from [Davidson et al. (2017)](https://ojs.aaai.org/index.php/ICWSM/article/view/14955) | integer |
| tweet_hashed | tweet with usernames hashed | character |
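To work with these columns programmatically, the following minimal sketch loads the data and compares hate speech annotation rates across the five conditions. It assumes the dataset loads with its default configuration and exposes a `train` split; adjust the split name if the repository defines different splits.
```python
from collections import defaultdict

from datasets import load_dataset

# Assumption: the default configuration exposes a "train" split.
ds = load_dataset("soda-lmu/tweet-annotation-experiments", split="train")

# Tally hate speech annotations per experimental condition (A-E).
counts = defaultdict(lambda: [0, 0])  # condition -> [hate speech count, total]
for row in ds:
    counts[row["condition"]][1] += 1
    if row["hate_speech"]:  # logical column; None/False count as negative
        counts[row["condition"]][0] += 1

for cond, (hs, total) in sorted(counts.items()):
    print(f"Condition {cond}: {hs}/{total} tweets annotated as hate speech")
```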
## Citation
If you find the dataset useful, please cite our dataset page: [https://huggingface.co/datasets/soda-lmu/tweet-annotation-experiments](https://huggingface.co/datasets/soda-lmu/tweet-annotation-experiments).
|
CyberHarem/anisphia_wynn_palettia_tenseioujototensaireijounomahoukakumei | 2023-09-17T17:35:36.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Anisphia Wynn Palettia
This is the dataset of Anisphia Wynn Palettia, containing 300 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 616 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 616 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 616 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 616 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
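The archives above can also be fetched programmatically. A minimal sketch using `huggingface_hub`, under the assumption that the zip files sit at the root of this dataset repository:
```python
from huggingface_hub import hf_hub_download

# Assumption: the archives listed above live at the repository root.
path = hf_hub_download(
    repo_id="CyberHarem/anisphia_wynn_palettia_tenseioujototensaireijounomahoukakumei",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
print(path)  # local cache path of the downloaded zip
```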
|
islamogenius/islamogenius1 | 2023-09-13T14:41:49.000Z | [
"region:us"
] | islamogenius | null | null | null | 0 | 0 | Entry not found |
KimleangSama/dataset | 2023-09-13T14:43:47.000Z | [
"region:us"
] | KimleangSama | null | null | null | 0 | 0 | Entry not found |
aboix/q76_campnow_downsampled_noduplicates | 2023-09-13T15:45:11.000Z | [
"region:us"
] | aboix | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: inputs
struct:
- name: text
dtype: string
- name: prediction
dtype: 'null'
- name: prediction_agent
dtype: 'null'
- name: annotation
sequence: string
- name: annotation_agent
dtype: string
- name: vectors
dtype: 'null'
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
struct:
- name: split
dtype: string
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 165389.7949951877
num_examples: 831
- name: test
num_bytes: 41397.20500481232
num_examples: 208
download_size: 105177
dataset_size: 206787.0
---
# Dataset Card for "q76_campnow_downsampled_noduplicates"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Benson/dolly_dialog_subset | 2023-09-13T15:32:31.000Z | [
"language:en",
"region:us"
] | Benson | null | null | null | 0 | 0 | ---
language:
- en
--- |
CyberHarem/euphyllia_magenta_tenseioujototensaireijounomahoukakumei | 2023-09-17T17:35:38.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Euphyllia Magenta
This is the dataset of Euphyllia Magenta, containing 300 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 635 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 635 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 635 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 635 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/abe_nana_idolmastercinderellagirls | 2023-09-17T17:35:40.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of abe_nana (THE iDOLM@STER: Cinderella Girls)
This is the dataset of abe_nana (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 521 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 521 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 521 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 521 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_PygmalionAI__mythalion-13b | 2023-09-13T15:45:16.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PygmalionAI/mythalion-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PygmalionAI/mythalion-13b](https://huggingface.co/PygmalionAI/mythalion-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PygmalionAI__mythalion-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T15:43:56.959580](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__mythalion-13b/blob/main/results_2023-09-13T15-43-56.959580.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5668139361802185,\n\
\ \"acc_stderr\": 0.03433655004935063,\n \"acc_norm\": 0.5706940243762324,\n\
\ \"acc_norm_stderr\": 0.03431449412200888,\n \"mc1\": 0.3219094247246022,\n\
\ \"mc1_stderr\": 0.016355567611960404,\n \"mc2\": 0.46562168990109065,\n\
\ \"mc2_stderr\": 0.015291610692060842\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.014438036220848032,\n\
\ \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.014235872487909869\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.644991037641904,\n\
\ \"acc_stderr\": 0.004775380866948015,\n \"acc_norm\": 0.8380800637323242,\n\
\ \"acc_norm_stderr\": 0.0036762448867232646\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526066,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526066\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425075,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425075\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n\
\ \"acc_stderr\": 0.02732754844795754,\n \"acc_norm\": 0.6387096774193548,\n\
\ \"acc_norm_stderr\": 0.02732754844795754\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860688,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860688\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.517948717948718,\n \"acc_stderr\": 0.025334667080954925,\n \
\ \"acc_norm\": 0.517948717948718,\n \"acc_norm_stderr\": 0.025334667080954925\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7394495412844037,\n \"acc_stderr\": 0.01881918203485007,\n \"\
acc_norm\": 0.7394495412844037,\n \"acc_norm_stderr\": 0.01881918203485007\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335435,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335435\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7713920817369093,\n\
\ \"acc_stderr\": 0.015016884698539866,\n \"acc_norm\": 0.7713920817369093,\n\
\ \"acc_norm_stderr\": 0.015016884698539866\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584194,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584194\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4581005586592179,\n\
\ \"acc_stderr\": 0.016663683295020527,\n \"acc_norm\": 0.4581005586592179,\n\
\ \"acc_norm_stderr\": 0.016663683295020527\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.027530078447110303,\n\
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.027530078447110303\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.027264297599804012,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.027264297599804012\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380154,\n\
\ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380154\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.0293922365846125,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.0293922365846125\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43089960886571055,\n\
\ \"acc_stderr\": 0.012647695889547228,\n \"acc_norm\": 0.43089960886571055,\n\
\ \"acc_norm_stderr\": 0.012647695889547228\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5915032679738562,\n \"acc_stderr\": 0.019886221037501862,\n \
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.019886221037501862\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3219094247246022,\n\
\ \"mc1_stderr\": 0.016355567611960404,\n \"mc2\": 0.46562168990109065,\n\
\ \"mc2_stderr\": 0.015291610692060842\n }\n}\n```"
repo_url: https://huggingface.co/PygmalionAI/mythalion-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|arc:challenge|25_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hellaswag|10_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T15-43-56.959580.parquet'
- config_name: results
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- results_2023-09-13T15-43-56.959580.parquet
- split: latest
path:
- results_2023-09-13T15-43-56.959580.parquet
---
# Dataset Card for Evaluation run of PygmalionAI/mythalion-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PygmalionAI/mythalion-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PygmalionAI/mythalion-13b](https://huggingface.co/PygmalionAI/mythalion-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PygmalionAI__mythalion-13b",
"harness_truthfulqa_mc_0",
	split="latest")
```
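The aggregated "results" configuration defined in this card's metadata can be loaded the same way; a minimal sketch, assuming the same `datasets` API:
```python
from datasets import load_dataset

# "results" and the "latest" split are both declared in the YAML metadata of
# this card; this returns the aggregated metrics for the most recent run.
results = load_dataset("open-llm-leaderboard/details_PygmalionAI__mythalion-13b",
	"results",
	split="latest")
```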
## Latest results
These are the [latest results from run 2023-09-13T15:43:56.959580](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__mythalion-13b/blob/main/results_2023-09-13T15-43-56.959580.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5668139361802185,
"acc_stderr": 0.03433655004935063,
"acc_norm": 0.5706940243762324,
"acc_norm_stderr": 0.03431449412200888,
"mc1": 0.3219094247246022,
"mc1_stderr": 0.016355567611960404,
"mc2": 0.46562168990109065,
"mc2_stderr": 0.015291610692060842
},
"harness|arc:challenge|25": {
"acc": 0.5767918088737202,
"acc_stderr": 0.014438036220848032,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.014235872487909869
},
"harness|hellaswag|10": {
"acc": 0.644991037641904,
"acc_stderr": 0.004775380866948015,
"acc_norm": 0.8380800637323242,
"acc_norm_stderr": 0.0036762448867232646
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425075,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425075
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.02732754844795754,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.02732754844795754
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860688,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860688
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.517948717948718,
"acc_stderr": 0.025334667080954925,
"acc_norm": 0.517948717948718,
"acc_norm_stderr": 0.025334667080954925
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7394495412844037,
"acc_stderr": 0.01881918203485007,
"acc_norm": 0.7394495412844037,
"acc_norm_stderr": 0.01881918203485007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335435,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335435
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7713920817369093,
"acc_stderr": 0.015016884698539866,
"acc_norm": 0.7713920817369093,
"acc_norm_stderr": 0.015016884698539866
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584194,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584194
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4581005586592179,
"acc_stderr": 0.016663683295020527,
"acc_norm": 0.4581005586592179,
"acc_norm_stderr": 0.016663683295020527
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.027530078447110303,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.027530078447110303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.027264297599804012,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.027264297599804012
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380154,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43089960886571055,
"acc_stderr": 0.012647695889547228,
"acc_norm": 0.43089960886571055,
"acc_norm_stderr": 0.012647695889547228
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.019886221037501862,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.019886221037501862
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3219094247246022,
"mc1_stderr": 0.016355567611960404,
"mc2": 0.46562168990109065,
"mc2_stderr": 0.015291610692060842
}
}
```
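For quick inspection, here is a small sketch (plain Python, no extra dependencies) that flattens a results dictionary like the one above into `(task, metric, value)` rows; the two entries shown are copied from the JSON above, and in practice you would parse the full `results_*.json` file instead:
```python
# A results dict like the one above; in practice, parse the full JSON file,
# e.g. with json.load(open("results_2023-09-13T15-43-56.959580.json")).
results = {
    "all": {"acc": 0.5668139361802185, "acc_norm": 0.5706940243762324},
    "harness|truthfulqa:mc|0": {"mc1": 0.3219094247246022, "mc2": 0.46562168990109065},
}

# Flatten the nested {task: {metric: value}} structure into sortable rows.
rows = [(task, metric, value)
        for task, metrics in results.items()
        for metric, value in metrics.items()]

for task, metric, value in sorted(rows):
    print(f"{task:30s} {metric:10s} {value:.4f}")
```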
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/illya_coral_tenseioujototensaireijounomahoukakumei | 2023-09-17T17:35:42.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Illya Coral
This is the dataset of Illya Coral, containing 117 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). A short download sketch follows the table below.
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 117 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 262 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 117 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 117 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 117 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 117 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 117 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 262 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 262 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 262 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
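As a rough sketch, any of the archives listed above can also be fetched programmatically; this assumes the zip files sit at the root of this dataset repository and uses the standard `huggingface_hub` download helper:
```python
from huggingface_hub import hf_hub_download

# Download one of the packaged archives from the table above; the repo_id and
# filename come from this card, and the local cache path is returned.
path = hf_hub_download(
    repo_id="CyberHarem/illya_coral_tenseioujototensaireijounomahoukakumei",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
print(path)
```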
|
open-llm-leaderboard/details_PocketDoc__Dans-PersonalityEngine-30b | 2023-09-13T15:49:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PocketDoc/Dans-PersonalityEngine-30b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PocketDoc/Dans-PersonalityEngine-30b](https://huggingface.co/PocketDoc/Dans-PersonalityEngine-30b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PocketDoc__Dans-PersonalityEngine-30b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T15:47:49.138140](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-PersonalityEngine-30b/blob/main/results_2023-09-13T15-47-49.138140.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.591174119220395,\n\
\ \"acc_stderr\": 0.03393583674683373,\n \"acc_norm\": 0.5949180666490618,\n\
\ \"acc_norm_stderr\": 0.033913573731471684,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.46980891008021114,\n\
\ \"mc2_stderr\": 0.014753681771105764\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.613481228668942,\n \"acc_stderr\": 0.014230084761910471,\n\
\ \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.014070265519268802\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6440948018323043,\n\
\ \"acc_stderr\": 0.004778081784542405,\n \"acc_norm\": 0.8436566421031667,\n\
\ \"acc_norm_stderr\": 0.0036243831208234508\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.04062990784146667,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.04062990784146667\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3544973544973545,\n \"acc_stderr\": 0.024636830602841997,\n \"\
acc_norm\": 0.3544973544973545,\n \"acc_norm_stderr\": 0.024636830602841997\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6935483870967742,\n\
\ \"acc_stderr\": 0.026226485652553883,\n \"acc_norm\": 0.6935483870967742,\n\
\ \"acc_norm_stderr\": 0.026226485652553883\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.034819048444388045,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.034819048444388045\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723886,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723886\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.025069094387296535,\n\
\ \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.025069094387296535\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03196876989195778,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03196876989195778\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7614678899082569,\n \"acc_stderr\": 0.018272575810231874,\n \"\
acc_norm\": 0.7614678899082569,\n \"acc_norm_stderr\": 0.018272575810231874\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653064,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653064\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.029102254389674082,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.029102254389674082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615622,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615622\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560413,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560413\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n\
\ \"acc_stderr\": 0.014927447101937153,\n \"acc_norm\": 0.7752234993614304,\n\
\ \"acc_norm_stderr\": 0.014927447101937153\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.025190181327608405,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.025190181327608405\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n\
\ \"acc_stderr\": 0.016563829399047707,\n \"acc_norm\": 0.4312849162011173,\n\
\ \"acc_norm_stderr\": 0.016563829399047707\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02782610930728369,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02782610930728369\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937627,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937627\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824087,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824087\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n\
\ \"acc_stderr\": 0.012733671880342506,\n \"acc_norm\": 0.4621903520208605,\n\
\ \"acc_norm_stderr\": 0.012733671880342506\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6111111111111112,\n \"acc_stderr\": 0.01972205893961806,\n \
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.01972205893961806\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.031001209039894843,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.031001209039894843\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\
\ \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.7860696517412935,\n\
\ \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743252,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743252\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.46980891008021114,\n\
\ \"mc2_stderr\": 0.014753681771105764\n }\n}\n```"
repo_url: https://huggingface.co/PocketDoc/Dans-PersonalityEngine-30b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|arc:challenge|25_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hellaswag|10_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-47-49.138140.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-47-49.138140.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T15-47-49.138140.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T15-47-49.138140.parquet'
- config_name: results
data_files:
- split: 2023_09_13T15_47_49.138140
path:
- results_2023-09-13T15-47-49.138140.parquet
- split: latest
path:
- results_2023-09-13T15-47-49.138140.parquet
---
# Dataset Card for Evaluation run of PocketDoc/Dans-PersonalityEngine-30b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PocketDoc/Dans-PersonalityEngine-30b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PocketDoc/Dans-PersonalityEngine-30b](https://huggingface.co/PocketDoc/Dans-PersonalityEngine-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-PersonalityEngine-30b",
"harness_truthfulqa_mc_0",
split="train")
```
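As a further sketch, the aggregated "results" configuration can be loaded the same way. The config name "results" and the split name "latest" are both taken from this card's configuration list above; only the usage pattern itself is illustrative:
```python
from datasets import load_dataset

# Minimal sketch: load the aggregated metrics of the most recent run.
# "results" is the aggregate config and "latest" always points to the
# newest evaluation, per the split naming described above.
results = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-PersonalityEngine-30b",
	"results",
	split="latest")
```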
## Latest results
These are the [latest results from run 2023-09-13T15:47:49.138140](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-PersonalityEngine-30b/blob/main/results_2023-09-13T15-47-49.138140.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.591174119220395,
"acc_stderr": 0.03393583674683373,
"acc_norm": 0.5949180666490618,
"acc_norm_stderr": 0.033913573731471684,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059605,
"mc2": 0.46980891008021114,
"mc2_stderr": 0.014753681771105764
},
"harness|arc:challenge|25": {
"acc": 0.613481228668942,
"acc_stderr": 0.014230084761910471,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.014070265519268802
},
"harness|hellaswag|10": {
"acc": 0.6440948018323043,
"acc_stderr": 0.004778081784542405,
"acc_norm": 0.8436566421031667,
"acc_norm_stderr": 0.0036243831208234508
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.04062990784146667,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.04062990784146667
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3544973544973545,
"acc_stderr": 0.024636830602841997,
"acc_norm": 0.3544973544973545,
"acc_norm_stderr": 0.024636830602841997
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6935483870967742,
"acc_stderr": 0.026226485652553883,
"acc_norm": 0.6935483870967742,
"acc_norm_stderr": 0.026226485652553883
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723886,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723886
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.025069094387296535,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.025069094387296535
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7614678899082569,
"acc_stderr": 0.018272575810231874,
"acc_norm": 0.7614678899082569,
"acc_norm_stderr": 0.018272575810231874
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653064,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653064
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615622,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615622
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560413,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560413
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.014927447101937153,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.014927447101937153
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.025190181327608405,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.025190181327608405
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.016563829399047707,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.016563829399047707
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02782610930728369,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02782610930728369
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937627,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937627
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824087,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824087
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4621903520208605,
"acc_stderr": 0.012733671880342506,
"acc_norm": 0.4621903520208605,
"acc_norm_stderr": 0.012733671880342506
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.01972205893961806,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.01972205893961806
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.031001209039894843,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.031001209039894843
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.02899690969332891,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.02899690969332891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743252,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743252
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059605,
"mc2": 0.46980891008021114,
"mc2_stderr": 0.014753681771105764
}
}
```
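As a hedged sketch of consuming these numbers programmatically, assuming the flat `{task: {metric: value}}` layout shown above (the local filename mirrors the `results_<timestamp>.json` naming used in this repository):
```python
import json

# Minimal sketch, assuming the flat {task: {metric: value}} layout above.
with open("results_2023-09-13T15-47-49.138140.json") as f:
    results = json.load(f)

# Rank entries by accuracy, best first; entries without "acc" (e.g. the
# truthfulqa mc task) are skipped, and the "all" aggregate appears too.
scored = [(task, m["acc"]) for task, m in results.items() if "acc" in m]
for task, acc in sorted(scored, key=lambda kv: kv[1], reverse=True):
    print(f"{task}: acc={acc:.4f}")
```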
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
xlajitx/SD | 2023-09-13T15:50:00.000Z | [
"region:us"
] | xlajitx | null | null | null | 0 | 0 | Entry not found |
CyberHarem/tilty_claret_tenseioujototensaireijounomahoukakumei | 2023-09-17T17:35:44.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Tilty Claret
This is the dataset of Tilty Claret, containing 147 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 147 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 288 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 147 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 147 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 147 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 147 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 147 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 288 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 288 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 288 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
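A minimal sketch of fetching one of the archives listed above with `huggingface_hub` (the repository id and filename come from this card; that the zip files sit at the repository root is an assumption):
```python
from huggingface_hub import hf_hub_download

# Minimal sketch: download the raw archive from the table above.
# Assumption: the zip files are stored at the repository root.
path = hf_hub_download(
    repo_id="CyberHarem/tilty_claret_tenseioujototensaireijounomahoukakumei",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
print(path)
```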
|
afern24/gtzan_all_preprocessed | 2023-09-13T16:10:31.000Z | [
"region:us"
] | afern24 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': blues
'1': classical
'2': country
'3': disco
'4': hiphop
'5': jazz
'6': metal
'7': pop
'8': reggae
'9': rock
- name: input_values
sequence: float32
- name: attention_mask
sequence: int32
splits:
- name: train
num_bytes: 3452159816
num_examples: 899
- name: test
num_bytes: 384000696
num_examples: 100
download_size: 1923103923
dataset_size: 3836160512
---
# Dataset Card for "gtzan_all_preprocessed"
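A minimal sketch of loading the preprocessed examples, assuming only the feature schema declared in the YAML header above (a genre `label`, float32 `input_values`, and an int32 `attention_mask`):
```python
from datasets import load_dataset

# Minimal sketch, assuming the schema from the dataset_info block above.
ds = load_dataset("afern24/gtzan_all_preprocessed", split="train")
example = ds[0]
print(ds.features["label"].int2str(example["label"]))  # genre name, e.g. "blues"
print(len(example["input_values"]), len(example["attention_mask"]))
```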
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/lainie_cyan_tenseioujototensaireijounomahoukakumei | 2023-09-17T17:35:46.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Lainie Cyan
This is the dataset of Lainie Cyan, containing 102 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 102 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 194 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 102 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 102 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 102 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 102 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 102 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 194 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 194 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 194 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_royallab__Pygmalion-2-13b-SuperCOT | 2023-09-13T16:16:11.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of royallab/Pygmalion-2-13b-SuperCOT
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [royallab/Pygmalion-2-13b-SuperCOT](https://huggingface.co/royallab/Pygmalion-2-13b-SuperCOT)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_royallab__Pygmalion-2-13b-SuperCOT\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T16:14:52.647563](https://huggingface.co/datasets/open-llm-leaderboard/details_royallab__Pygmalion-2-13b-SuperCOT/blob/main/results_2023-09-13T16-14-52.647563.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5512933540299775,\n\
\ \"acc_stderr\": 0.03456683575991647,\n \"acc_norm\": 0.5553123422726481,\n\
\ \"acc_norm_stderr\": 0.03454389718874387,\n \"mc1\": 0.38310893512851896,\n\
\ \"mc1_stderr\": 0.01701846167938986,\n \"mc2\": 0.531441871245389,\n\
\ \"mc2_stderr\": 0.015443380559262922\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5989761092150171,\n \"acc_stderr\": 0.014322255790719869,\n\
\ \"acc_norm\": 0.6322525597269625,\n \"acc_norm_stderr\": 0.014090995618168484\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6329416450906195,\n\
\ \"acc_stderr\": 0.004810175357870934,\n \"acc_norm\": 0.8367855008962358,\n\
\ \"acc_norm_stderr\": 0.0036880598312390277\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.030052580579557845,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.030052580579557845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273958,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273958\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.024326310529149138,\n \"\
acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 0.024326310529149138\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n\
\ \"acc_stderr\": 0.027528904299845704,\n \"acc_norm\": 0.6258064516129033,\n\
\ \"acc_norm_stderr\": 0.027528904299845704\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.034304624161038716,\n\
\ \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.034304624161038716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.03332299921070643,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.03332299921070643\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245265,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245265\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5051282051282051,\n \"acc_stderr\": 0.025349672906838653,\n\
\ \"acc_norm\": 0.5051282051282051,\n \"acc_norm_stderr\": 0.025349672906838653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n \
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7247706422018348,\n \"acc_stderr\": 0.0191490937431552,\n \"acc_norm\"\
: 0.7247706422018348,\n \"acc_norm_stderr\": 0.0191490937431552\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3287037037037037,\n\
\ \"acc_stderr\": 0.032036140846700596,\n \"acc_norm\": 0.3287037037037037,\n\
\ \"acc_norm_stderr\": 0.032036140846700596\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134986,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134986\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516304,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516304\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n\
\ \"acc_stderr\": 0.015491088951494569,\n \"acc_norm\": 0.7496807151979565,\n\
\ \"acc_norm_stderr\": 0.015491088951494569\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n\
\ \"acc_stderr\": 0.016232826818678502,\n \"acc_norm\": 0.37988826815642457,\n\
\ \"acc_norm_stderr\": 0.016232826818678502\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.02760468902858199,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.02760468902858199\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370604,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370604\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3741851368970013,\n\
\ \"acc_stderr\": 0.01235933561817206,\n \"acc_norm\": 0.3741851368970013,\n\
\ \"acc_norm_stderr\": 0.01235933561817206\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5490196078431373,\n \"acc_stderr\": 0.020130388312904528,\n \
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.020130388312904528\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.030965903123573037,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.030965903123573037\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38310893512851896,\n\
\ \"mc1_stderr\": 0.01701846167938986,\n \"mc2\": 0.531441871245389,\n\
\ \"mc2_stderr\": 0.015443380559262922\n }\n}\n```"
repo_url: https://huggingface.co/royallab/Pygmalion-2-13b-SuperCOT
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|arc:challenge|25_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hellaswag|10_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-14-52.647563.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-14-52.647563.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T16-14-52.647563.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T16-14-52.647563.parquet'
- config_name: results
data_files:
- split: 2023_09_13T16_14_52.647563
path:
- results_2023-09-13T16-14-52.647563.parquet
- split: latest
path:
- results_2023-09-13T16-14-52.647563.parquet
---
# Dataset Card for Evaluation run of royallab/Pygmalion-2-13b-SuperCOT
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/royallab/Pygmalion-2-13b-SuperCOT
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [royallab/Pygmalion-2-13b-SuperCOT](https://huggingface.co/royallab/Pygmalion-2-13b-SuperCOT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_royallab__Pygmalion-2-13b-SuperCOT",
"harness_truthfulqa_mc_0",
split="train")
```
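The aggregated "results" configuration listed in the YAML above can be loaded the same way; a minimal sketch, assuming the `datasets` library and the split names declared in this card:
```python
from datasets import load_dataset

# Aggregated metrics for this model (the "results" config declared above);
# the "latest" split always resolves to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_royallab__Pygmalion-2-13b-SuperCOT",
    "results",
    split="latest",
)
```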
## Latest results
These are the [latest results from run 2023-09-13T16:14:52.647563](https://huggingface.co/datasets/open-llm-leaderboard/details_royallab__Pygmalion-2-13b-SuperCOT/blob/main/results_2023-09-13T16-14-52.647563.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5512933540299775,
"acc_stderr": 0.03456683575991647,
"acc_norm": 0.5553123422726481,
"acc_norm_stderr": 0.03454389718874387,
"mc1": 0.38310893512851896,
"mc1_stderr": 0.01701846167938986,
"mc2": 0.531441871245389,
"mc2_stderr": 0.015443380559262922
},
"harness|arc:challenge|25": {
"acc": 0.5989761092150171,
"acc_stderr": 0.014322255790719869,
"acc_norm": 0.6322525597269625,
"acc_norm_stderr": 0.014090995618168484
},
"harness|hellaswag|10": {
"acc": 0.6329416450906195,
"acc_stderr": 0.004810175357870934,
"acc_norm": 0.8367855008962358,
"acc_norm_stderr": 0.0036880598312390277
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.030052580579557845,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.030052580579557845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273958,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273958
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.024326310529149138,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.024326310529149138
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.027528904299845704,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.027528904299845704
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.034304624161038716,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.034304624161038716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.03332299921070643,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.03332299921070643
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.028408953626245265,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.028408953626245265
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5051282051282051,
"acc_stderr": 0.025349672906838653,
"acc_norm": 0.5051282051282051,
"acc_norm_stderr": 0.025349672906838653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7247706422018348,
"acc_stderr": 0.0191490937431552,
"acc_norm": 0.7247706422018348,
"acc_norm_stderr": 0.0191490937431552
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.032036140846700596,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.032036140846700596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134986,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134986
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516304,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516304
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335445,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7496807151979565,
"acc_stderr": 0.015491088951494569,
"acc_norm": 0.7496807151979565,
"acc_norm_stderr": 0.015491088951494569
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37988826815642457,
"acc_stderr": 0.016232826818678502,
"acc_norm": 0.37988826815642457,
"acc_norm_stderr": 0.016232826818678502
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.02760468902858199,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.02760468902858199
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370604,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370604
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3741851368970013,
"acc_stderr": 0.01235933561817206,
"acc_norm": 0.3741851368970013,
"acc_norm_stderr": 0.01235933561817206
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.49264705882352944,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.49264705882352944,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.020130388312904528,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.020130388312904528
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573037,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573037
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38310893512851896,
"mc1_stderr": 0.01701846167938986,
"mc2": 0.531441871245389,
"mc2_stderr": 0.015443380559262922
}
}
```
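As an illustrative sketch (not part of the original card), the results file linked above can also be fetched directly with `huggingface_hub` and the per-task accuracies ranked; it assumes the file contains the metrics dictionary printed above, possibly nested under a top-level `"results"` key:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file linked above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_royallab__Pygmalion-2-13b-SuperCOT",
    filename="results_2023-09-13T16-14-52.647563.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Assumption: the metrics sit at the top level (as printed above) or under a
# "results" key; fall back to the whole document if the key is absent.
metrics = data.get("results", data)

# Rank tasks by accuracy, skipping the "all" aggregate and entries without an
# "acc" field (truthfulqa reports mc1/mc2 instead).
per_task = {
    name: m["acc"]
    for name, m in metrics.items()
    if name != "all" and isinstance(m, dict) and "acc" in m
}
for name, acc in sorted(per_task.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{name}: {acc:.3f}")
```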
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_tiiuae__falcon-rw-1b | 2023-09-13T16:18:01.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of tiiuae/falcon-rw-1b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [tiiuae/falcon-rw-1b](https://huggingface.co/tiiuae/falcon-rw-1b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tiiuae__falcon-rw-1b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T16:16:44.792936](https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-rw-1b/blob/main/results_2023-09-13T16-16-44.792936.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2576958188312759,\n\
\ \"acc_stderr\": 0.03170992377612896,\n \"acc_norm\": 0.26095075744871704,\n\
\ \"acc_norm_stderr\": 0.031711112483045936,\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.014450846714123899,\n \"mc2\": 0.3595559898003175,\n\
\ \"mc2_stderr\": 0.013676223639446223\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.32593856655290104,\n \"acc_stderr\": 0.013697432466693244,\n\
\ \"acc_norm\": 0.3506825938566553,\n \"acc_norm_stderr\": 0.013944635930726096\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4683330013941446,\n\
\ \"acc_stderr\": 0.0049797638621349866,\n \"acc_norm\": 0.6356303525194185,\n\
\ \"acc_norm_stderr\": 0.004802694106203663\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2565789473684211,\n \"acc_stderr\": 0.035541803680256896,\n\
\ \"acc_norm\": 0.2565789473684211,\n \"acc_norm_stderr\": 0.035541803680256896\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149352,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149352\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.028957342788342343,\n\
\ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.028957342788342343\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518752,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518752\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.02300008685906864,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.02300008685906864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.18387096774193548,\n\
\ \"acc_stderr\": 0.022037217340267836,\n \"acc_norm\": 0.18387096774193548,\n\
\ \"acc_norm_stderr\": 0.022037217340267836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733545,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733545\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.16161616161616163,\n \"acc_stderr\": 0.026225919863629293,\n \"\
acc_norm\": 0.16161616161616163,\n \"acc_norm_stderr\": 0.026225919863629293\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.17098445595854922,\n \"acc_stderr\": 0.02717121368316453,\n\
\ \"acc_norm\": 0.17098445595854922,\n \"acc_norm_stderr\": 0.02717121368316453\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21025641025641026,\n \"acc_stderr\": 0.020660597485026938,\n\
\ \"acc_norm\": 0.21025641025641026,\n \"acc_norm_stderr\": 0.020660597485026938\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.02592887613276612,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.02592887613276612\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22201834862385322,\n \"acc_stderr\": 0.017818849564796645,\n \"\
acc_norm\": 0.22201834862385322,\n \"acc_norm_stderr\": 0.017818849564796645\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18981481481481483,\n \"acc_stderr\": 0.026744714834691936,\n \"\
acc_norm\": 0.18981481481481483,\n \"acc_norm_stderr\": 0.026744714834691936\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.03114557065948678,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.03114557065948678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.273542600896861,\n\
\ \"acc_stderr\": 0.029918586707798824,\n \"acc_norm\": 0.273542600896861,\n\
\ \"acc_norm_stderr\": 0.029918586707798824\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531772,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531772\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\
\ \"acc_stderr\": 0.029614323690456648,\n \"acc_norm\": 0.2863247863247863,\n\
\ \"acc_norm_stderr\": 0.029614323690456648\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2681992337164751,\n\
\ \"acc_stderr\": 0.015842430835269438,\n \"acc_norm\": 0.2681992337164751,\n\
\ \"acc_norm_stderr\": 0.015842430835269438\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2745664739884393,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.2745664739884393,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808836,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808836\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.02463004897982477,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.02463004897982477\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2765273311897106,\n\
\ \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.2765273311897106,\n\
\ \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843014,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843014\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2653194263363755,\n\
\ \"acc_stderr\": 0.011276198843958876,\n \"acc_norm\": 0.2653194263363755,\n\
\ \"acc_norm_stderr\": 0.011276198843958876\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.15808823529411764,\n \"acc_stderr\": 0.02216146260806852,\n\
\ \"acc_norm\": 0.15808823529411764,\n \"acc_norm_stderr\": 0.02216146260806852\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.238562091503268,\n \"acc_stderr\": 0.017242385828779568,\n \
\ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.017242385828779568\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721378,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721378\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22857142857142856,\n \"acc_stderr\": 0.026882144922307748,\n\
\ \"acc_norm\": 0.22857142857142856,\n \"acc_norm_stderr\": 0.026882144922307748\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.030965903123573012,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.030965903123573012\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n\
\ \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n\
\ \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.014450846714123899,\n \"mc2\": 0.3595559898003175,\n\
\ \"mc2_stderr\": 0.013676223639446223\n }\n}\n```"
repo_url: https://huggingface.co/tiiuae/falcon-rw-1b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|arc:challenge|25_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hellaswag|10_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T16-16-44.792936.parquet'
- config_name: results
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- results_2023-09-13T16-16-44.792936.parquet
- split: latest
path:
- results_2023-09-13T16-16-44.792936.parquet
---
# Dataset Card for Evaluation run of tiiuae/falcon-rw-1b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/tiiuae/falcon-rw-1b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [tiiuae/falcon-rw-1b](https://huggingface.co/tiiuae/falcon-rw-1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tiiuae__falcon-rw-1b",
"harness_truthfulqa_mc_0",
	split="latest")
```
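Since the card declares 61 per-task configurations, you can also enumerate them programmatically before choosing one; a minimal sketch using the standard `datasets` helper (only the repo id is taken from this card):
```python
from datasets import get_dataset_config_names

# List every per-task configuration declared in this repository (61 in total).
configs = get_dataset_config_names("open-llm-leaderboard/details_tiiuae__falcon-rw-1b")
print(len(configs))
print(configs[:5])  # a few of the harness_* config names
```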
## Latest results
These are the [latest results from run 2023-09-13T16:16:44.792936](https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-rw-1b/blob/main/results_2023-09-13T16-16-44.792936.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2576958188312759,
"acc_stderr": 0.03170992377612896,
"acc_norm": 0.26095075744871704,
"acc_norm_stderr": 0.031711112483045936,
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123899,
"mc2": 0.3595559898003175,
"mc2_stderr": 0.013676223639446223
},
"harness|arc:challenge|25": {
"acc": 0.32593856655290104,
"acc_stderr": 0.013697432466693244,
"acc_norm": 0.3506825938566553,
"acc_norm_stderr": 0.013944635930726096
},
"harness|hellaswag|10": {
"acc": 0.4683330013941446,
"acc_stderr": 0.0049797638621349866,
"acc_norm": 0.6356303525194185,
"acc_norm_stderr": 0.004802694106203663
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2565789473684211,
"acc_stderr": 0.035541803680256896,
"acc_norm": 0.2565789473684211,
"acc_norm_stderr": 0.035541803680256896
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149352,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149352
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2680851063829787,
"acc_stderr": 0.028957342788342343,
"acc_norm": 0.2680851063829787,
"acc_norm_stderr": 0.028957342788342343
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518752,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518752
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.02300008685906864,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.02300008685906864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18387096774193548,
"acc_stderr": 0.022037217340267836,
"acc_norm": 0.18387096774193548,
"acc_norm_stderr": 0.022037217340267836
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733545,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733545
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.16161616161616163,
"acc_stderr": 0.026225919863629293,
"acc_norm": 0.16161616161616163,
"acc_norm_stderr": 0.026225919863629293
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.17098445595854922,
"acc_stderr": 0.02717121368316453,
"acc_norm": 0.17098445595854922,
"acc_norm_stderr": 0.02717121368316453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21025641025641026,
"acc_stderr": 0.020660597485026938,
"acc_norm": 0.21025641025641026,
"acc_norm_stderr": 0.020660597485026938
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.02592887613276612,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.02592887613276612
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22201834862385322,
"acc_stderr": 0.017818849564796645,
"acc_norm": 0.22201834862385322,
"acc_norm_stderr": 0.017818849564796645
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18981481481481483,
"acc_stderr": 0.026744714834691936,
"acc_norm": 0.18981481481481483,
"acc_norm_stderr": 0.026744714834691936
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.03114557065948678,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.03114557065948678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.273542600896861,
"acc_stderr": 0.029918586707798824,
"acc_norm": 0.273542600896861,
"acc_norm_stderr": 0.029918586707798824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531772,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531772
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.029614323690456648,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.029614323690456648
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2681992337164751,
"acc_stderr": 0.015842430835269438,
"acc_norm": 0.2681992337164751,
"acc_norm_stderr": 0.015842430835269438
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2745664739884393,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.2745664739884393,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808836,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808836
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.02463004897982477,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.02463004897982477
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2765273311897106,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.2765273311897106,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843014,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843014
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2653194263363755,
"acc_stderr": 0.011276198843958876,
"acc_norm": 0.2653194263363755,
"acc_norm_stderr": 0.011276198843958876
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.15808823529411764,
"acc_stderr": 0.02216146260806852,
"acc_norm": 0.15808823529411764,
"acc_norm_stderr": 0.02216146260806852
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.017242385828779568,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.017242385828779568
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721378,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721378
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22857142857142856,
"acc_stderr": 0.026882144922307748,
"acc_norm": 0.22857142857142856,
"acc_norm_stderr": 0.026882144922307748
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573012,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573012
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123899,
"mc2": 0.3595559898003175,
"mc2_stderr": 0.013676223639446223
}
}
```
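Rather than copying numbers out of the JSON above, the same aggregates can be loaded through the `results` configuration; a minimal sketch, assuming the `results` config and `latest` split declared in the YAML header of this card:
```python
from datasets import load_dataset

# Load the aggregated results for the most recent run of this evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_tiiuae__falcon-rw-1b",
    "results",
    split="latest",
)
print(results[0])  # inspect the row(s) holding the aggregated metrics
```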
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
uran66/animals | 2023-09-13T16:44:00.000Z | [
"license:unknown",
"region:us"
] | uran66 | null | null | null | 0 | 0 | ---
license: unknown
---
|
CyberHarem/yukino_yukinoshita_yahariorenoseishunlovecomewamachigatteiru | 2023-09-17T17:35:48.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yukino Yukinoshita
This is the dataset of Yukino Yukinoshita, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 680 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 680 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 680 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 680 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
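The archives above are plain zip files stored in this repository; a minimal sketch for fetching and unpacking one of them with `huggingface_hub` (the chosen filename and output directory are arbitrary examples):
```python
import zipfile

from huggingface_hub import hf_hub_download

# Download one packaged archive from this dataset repo and extract it locally.
path = hf_hub_download(
    repo_id="CyberHarem/yukino_yukinoshita_yahariorenoseishunlovecomewamachigatteiru",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(path) as archive:
    archive.extractall("yukino_512x512")
```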
|
dungvc2/khanhdm | 2023-09-13T16:34:32.000Z | [
"region:us"
] | dungvc2 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_sauce1337__AppleSauce-L2-13b | 2023-09-13T16:34:16.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of sauce1337/AppleSauce-L2-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sauce1337/AppleSauce-L2-13b](https://huggingface.co/sauce1337/AppleSauce-L2-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sauce1337__AppleSauce-L2-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T16:32:51.732119](https://huggingface.co/datasets/open-llm-leaderboard/details_sauce1337__AppleSauce-L2-13b/blob/main/results_2023-09-13T16-32-51.732119.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.572255468698196,\n\
\ \"acc_stderr\": 0.0342210373928205,\n \"acc_norm\": 0.5758241500462178,\n\
\ \"acc_norm_stderr\": 0.03420084086804014,\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.016571797910626608,\n \"mc2\": 0.47814234702313524,\n\
\ \"mc2_stderr\": 0.015461431243599005\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.014356399418009124,\n\
\ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892889\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6426010754829715,\n\
\ \"acc_stderr\": 0.004782542754102087,\n \"acc_norm\": 0.8360884285998805,\n\
\ \"acc_norm_stderr\": 0.003694387361177659\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286648,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286648\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899207,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899207\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\
\ \"acc_stderr\": 0.026923446059302844,\n \"acc_norm\": 0.6612903225806451,\n\
\ \"acc_norm_stderr\": 0.026923446059302844\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178816,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178816\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139744,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139744\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5461538461538461,\n \"acc_stderr\": 0.025242770987126177,\n\
\ \"acc_norm\": 0.5461538461538461,\n \"acc_norm_stderr\": 0.025242770987126177\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228412,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228412\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n\
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7596330275229358,\n\
\ \"acc_stderr\": 0.01832060732096407,\n \"acc_norm\": 0.7596330275229358,\n\
\ \"acc_norm_stderr\": 0.01832060732096407\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.033509916046960415,\n\
\ \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.033509916046960415\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650741,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650741\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.045723723587374296,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.045723723587374296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922737,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922737\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7573435504469987,\n\
\ \"acc_stderr\": 0.015329888940899846,\n \"acc_norm\": 0.7573435504469987,\n\
\ \"acc_norm_stderr\": 0.015329888940899846\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.02572280220089581,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.02572280220089581\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n\
\ \"acc_stderr\": 0.01656897123354861,\n \"acc_norm\": 0.4324022346368715,\n\
\ \"acc_norm_stderr\": 0.01656897123354861\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.02763417668960266,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.02763417668960266\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.026517597724465013,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.026517597724465013\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4282920469361147,\n\
\ \"acc_stderr\": 0.012638223880313167,\n \"acc_norm\": 0.4282920469361147,\n\
\ \"acc_norm_stderr\": 0.012638223880313167\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.029972807170464626,\n\
\ \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.029972807170464626\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5784313725490197,\n \"acc_stderr\": 0.019977422600227477,\n \
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.019977422600227477\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087565,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087565\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.03036049015401464,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.03036049015401464\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.016571797910626608,\n \"mc2\": 0.47814234702313524,\n\
\ \"mc2_stderr\": 0.015461431243599005\n }\n}\n```"
repo_url: https://huggingface.co/sauce1337/AppleSauce-L2-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|arc:challenge|25_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hellaswag|10_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-32-51.732119.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-32-51.732119.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T16-32-51.732119.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T16-32-51.732119.parquet'
- config_name: results
data_files:
- split: 2023_09_13T16_32_51.732119
path:
- results_2023-09-13T16-32-51.732119.parquet
- split: latest
path:
- results_2023-09-13T16-32-51.732119.parquet
---
# Dataset Card for Evaluation run of sauce1337/AppleSauce-L2-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/sauce1337/AppleSauce-L2-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [sauce1337/AppleSauce-L2-13b](https://huggingface.co/sauce1337/AppleSauce-L2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# "latest" always points to the most recent run; other splits are named
# by the run timestamp (see the configs in the YAML header above).
data = load_dataset("open-llm-leaderboard/details_sauce1337__AppleSauce-L2-13b",
	"harness_truthfulqa_mc_0",
	split="latest")
```
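The aggregated metrics are exposed through the "results" configuration declared in the YAML header of this card; a minimal sketch of loading them, assuming only the config and split names defined there:

```python
from datasets import load_dataset

# Load the aggregated metrics for this model; the "results" config and
# "latest" split are the ones declared in the YAML header above.
results = load_dataset("open-llm-leaderboard/details_sauce1337__AppleSauce-L2-13b",
	"results",
	split="latest")
```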
## Latest results
These are the [latest results from run 2023-09-13T16:32:51.732119](https://huggingface.co/datasets/open-llm-leaderboard/details_sauce1337__AppleSauce-L2-13b/blob/main/results_2023-09-13T16-32-51.732119.json) (note that there might be results for other tasks in the repo if successive evaluations didn't cover the same tasks; you can find each in the "results" config and the "latest" split of each per-task config):
```python
{
"all": {
"acc": 0.572255468698196,
"acc_stderr": 0.0342210373928205,
"acc_norm": 0.5758241500462178,
"acc_norm_stderr": 0.03420084086804014,
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626608,
"mc2": 0.47814234702313524,
"mc2_stderr": 0.015461431243599005
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.014356399418009124,
"acc_norm": 0.6100682593856656,
"acc_norm_stderr": 0.014252959848892889
},
"harness|hellaswag|10": {
"acc": 0.6426010754829715,
"acc_stderr": 0.004782542754102087,
"acc_norm": 0.8360884285998805,
"acc_norm_stderr": 0.003694387361177659
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286648,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286648
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899207,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302844,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302844
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.03242497958178816,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.03242497958178816
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139744,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139744
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5461538461538461,
"acc_stderr": 0.025242770987126177,
"acc_norm": 0.5461538461538461,
"acc_norm_stderr": 0.025242770987126177
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228412,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228412
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7596330275229358,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.7596330275229358,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.033509916046960415,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.033509916046960415
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650741,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650741
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.045723723587374296,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.045723723587374296
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922737,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922737
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7573435504469987,
"acc_stderr": 0.015329888940899846,
"acc_norm": 0.7573435504469987,
"acc_norm_stderr": 0.015329888940899846
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.02572280220089581,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.02572280220089581
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.01656897123354861,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.01656897123354861
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.02763417668960266,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.02763417668960266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.026517597724465013,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.026517597724465013
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4282920469361147,
"acc_stderr": 0.012638223880313167,
"acc_norm": 0.4282920469361147,
"acc_norm_stderr": 0.012638223880313167
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.029972807170464626,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.029972807170464626
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.019977422600227477,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.019977422600227477
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087565,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087565
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.03036049015401464,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.03036049015401464
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626608,
"mc2": 0.47814234702313524,
"mc2_stderr": 0.015461431243599005
}
}
```
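For a quick programmatic view of these numbers, a short sketch (assuming the nested layout shown above, where `"all"` holds the averages and every other key is one harness task) that reads the linked results file locally:

```python
import json

# Hypothetical local read of the results file linked above; the file
# name follows this run's timestamp.
with open("results_2023-09-13T16-32-51.732119.json") as f:
    results = json.load(f)

# Every key except "all" is a single harness task; truthfulqa reports
# mc1/mc2 instead of acc, so fall back to mc2 when acc is absent.
for task, metrics in sorted(results.items()):
    if task == "all":
        continue
    print(f"{task}: {metrics.get('acc', metrics.get('mc2'))}")
```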
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
shawarmas/actualmodernmesfia | 2023-09-13T16:38:12.000Z | [
"region:us"
] | shawarmas | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_nicholasKluge__Aira-2-355M | 2023-09-13T16:43:58.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of nicholasKluge/Aira-2-355M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nicholasKluge/Aira-2-355M](https://huggingface.co/nicholasKluge/Aira-2-355M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-2-355M\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T16:42:47.066460](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-2-355M/blob/main/results_2023-09-13T16-42-47.066460.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the \"results\" config and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2731095353137795,\n\
\ \"acc_stderr\": 0.0320570767806455,\n \"acc_norm\": 0.2745852370598572,\n\
\ \"acc_norm_stderr\": 0.03206776464118219,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871122,\n \"mc2\": 0.3852958717643368,\n\
\ \"mc2_stderr\": 0.014551409809539893\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.24658703071672355,\n \"acc_stderr\": 0.012595726268790127,\n\
\ \"acc_norm\": 0.27559726962457337,\n \"acc_norm_stderr\": 0.013057169655761838\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3311093407687712,\n\
\ \"acc_stderr\": 0.004696505101217406,\n \"acc_norm\": 0.3891655048795061,\n\
\ \"acc_norm_stderr\": 0.004865645485910432\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.037857144650666544,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.037857144650666544\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.26973684210526316,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.26973684210526316,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.19,\n\
\ \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.19,\n \
\ \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3283018867924528,\n \"acc_stderr\": 0.028901593612411784,\n\
\ \"acc_norm\": 0.3283018867924528,\n \"acc_norm_stderr\": 0.028901593612411784\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n\
\ \"acc_stderr\": 0.034564257450869995,\n \"acc_norm\": 0.28901734104046245,\n\
\ \"acc_norm_stderr\": 0.034564257450869995\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.028659179374292326,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.028659179374292326\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309994,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309994\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.24838709677419354,\n \"acc_stderr\": 0.02458002892148101,\n \"\
acc_norm\": 0.24838709677419354,\n \"acc_norm_stderr\": 0.02458002892148101\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n \"\
acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3434343434343434,\n \"acc_stderr\": 0.033832012232444426,\n \"\
acc_norm\": 0.3434343434343434,\n \"acc_norm_stderr\": 0.033832012232444426\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35233160621761656,\n \"acc_stderr\": 0.03447478286414359,\n\
\ \"acc_norm\": 0.35233160621761656,\n \"acc_norm_stderr\": 0.03447478286414359\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.024359581465396987,\n\
\ \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396987\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275805,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275805\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.33613445378151263,\n \"acc_stderr\": 0.03068473711513536,\n\
\ \"acc_norm\": 0.33613445378151263,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255168,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255168\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3467889908256881,\n \"acc_stderr\": 0.020406097104093027,\n \"\
acc_norm\": 0.3467889908256881,\n \"acc_norm_stderr\": 0.020406097104093027\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\"\
: 0.4398148148148148,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n\
\ \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n\
\ \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.20675105485232068,\n \"acc_stderr\": 0.026361651668389104,\n\
\ \"acc_norm\": 0.20675105485232068,\n \"acc_norm_stderr\": 0.026361651668389104\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.14349775784753363,\n\
\ \"acc_stderr\": 0.02352937126961819,\n \"acc_norm\": 0.14349775784753363,\n\
\ \"acc_norm_stderr\": 0.02352937126961819\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"\
acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16964285714285715,\n\
\ \"acc_stderr\": 0.0356236785009539,\n \"acc_norm\": 0.16964285714285715,\n\
\ \"acc_norm_stderr\": 0.0356236785009539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.23931623931623933,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2260536398467433,\n\
\ \"acc_stderr\": 0.01495745850433583,\n \"acc_norm\": 0.2260536398467433,\n\
\ \"acc_norm_stderr\": 0.01495745850433583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.022797110278071138,\n\
\ \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.022797110278071138\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767864,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.025457756696667874,\n\
\ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.025457756696667874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2508038585209003,\n\
\ \"acc_stderr\": 0.024619771956697165,\n \"acc_norm\": 0.2508038585209003,\n\
\ \"acc_norm_stderr\": 0.024619771956697165\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22839506172839505,\n \"acc_stderr\": 0.023358211840626267,\n\
\ \"acc_norm\": 0.22839506172839505,\n \"acc_norm_stderr\": 0.023358211840626267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266722,\n \
\ \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266722\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2405475880052151,\n\
\ \"acc_stderr\": 0.01091640673547895,\n \"acc_norm\": 0.2405475880052151,\n\
\ \"acc_norm_stderr\": 0.01091640673547895\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4522058823529412,\n \"acc_stderr\": 0.030233758551596452,\n\
\ \"acc_norm\": 0.4522058823529412,\n \"acc_norm_stderr\": 0.030233758551596452\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.03831305140884603,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.03831305140884603\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n\
\ \"acc_stderr\": 0.030709824050565274,\n \"acc_norm\": 0.1927710843373494,\n\
\ \"acc_norm_stderr\": 0.030709824050565274\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23391812865497075,\n \"acc_stderr\": 0.03246721765117825,\n\
\ \"acc_norm\": 0.23391812865497075,\n \"acc_norm_stderr\": 0.03246721765117825\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871122,\n \"mc2\": 0.3852958717643368,\n\
\ \"mc2_stderr\": 0.014551409809539893\n }\n}\n```"
repo_url: https://huggingface.co/nicholasKluge/Aira-2-355M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|arc:challenge|25_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hellaswag|10_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-42-47.066460.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-42-47.066460.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T16-42-47.066460.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T16-42-47.066460.parquet'
- config_name: results
data_files:
- split: 2023_09_13T16_42_47.066460
path:
- results_2023-09-13T16-42-47.066460.parquet
- split: latest
path:
- results_2023-09-13T16-42-47.066460.parquet
---
# Dataset Card for Evaluation run of nicholasKluge/Aira-2-355M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-2-355M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-2-355M](https://huggingface.co/nicholasKluge/Aira-2-355M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-2-355M",
"harness_truthfulqa_mc_0",
split="train")
```
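The same call works for any configuration declared in this card's YAML header. As a minimal sketch (assuming network access and a recent version of the `datasets` library), the aggregated `results` configuration can be loaded at its `latest` split:
```python
from datasets import load_dataset

# Both the "results" configuration and the "latest" split are declared in the
# YAML header of this card, so this resolves to the aggregated results parquet
# rather than one of the per-task detail files.
results = load_dataset(
    "open-llm-leaderboard/details_nicholasKluge__Aira-2-355M",
    "results",
    split="latest",
)
print(results)
```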
## Latest results
These are the [latest results from run 2023-09-13T16:42:47.066460](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-2-355M/blob/main/results_2023-09-13T16-42-47.066460.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2731095353137795,
"acc_stderr": 0.0320570767806455,
"acc_norm": 0.2745852370598572,
"acc_norm_stderr": 0.03206776464118219,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871122,
"mc2": 0.3852958717643368,
"mc2_stderr": 0.014551409809539893
},
"harness|arc:challenge|25": {
"acc": 0.24658703071672355,
"acc_stderr": 0.012595726268790127,
"acc_norm": 0.27559726962457337,
"acc_norm_stderr": 0.013057169655761838
},
"harness|hellaswag|10": {
"acc": 0.3311093407687712,
"acc_stderr": 0.004696505101217406,
"acc_norm": 0.3891655048795061,
"acc_norm_stderr": 0.004865645485910432
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.037857144650666544,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.037857144650666544
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.26973684210526316,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.26973684210526316,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3283018867924528,
"acc_stderr": 0.028901593612411784,
"acc_norm": 0.3283018867924528,
"acc_norm_stderr": 0.028901593612411784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.034564257450869995,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.034564257450869995
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.028659179374292326,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.028659179374292326
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309994,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309994
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.02458002892148101,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.02458002892148101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3434343434343434,
"acc_stderr": 0.033832012232444426,
"acc_norm": 0.3434343434343434,
"acc_norm_stderr": 0.033832012232444426
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35233160621761656,
"acc_stderr": 0.03447478286414359,
"acc_norm": 0.35233160621761656,
"acc_norm_stderr": 0.03447478286414359
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396987,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396987
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275805,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275805
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.33613445378151263,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.33613445378151263,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255168,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3467889908256881,
"acc_stderr": 0.020406097104093027,
"acc_norm": 0.3467889908256881,
"acc_norm_stderr": 0.020406097104093027
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20675105485232068,
"acc_stderr": 0.026361651668389104,
"acc_norm": 0.20675105485232068,
"acc_norm_stderr": 0.026361651668389104
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.14349775784753363,
"acc_stderr": 0.02352937126961819,
"acc_norm": 0.14349775784753363,
"acc_norm_stderr": 0.02352937126961819
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.14049586776859505,
"acc_stderr": 0.03172233426002161,
"acc_norm": 0.14049586776859505,
"acc_norm_stderr": 0.03172233426002161
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16964285714285715,
"acc_stderr": 0.0356236785009539,
"acc_norm": 0.16964285714285715,
"acc_norm_stderr": 0.0356236785009539
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23931623931623933,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.23931623931623933,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2260536398467433,
"acc_stderr": 0.01495745850433583,
"acc_norm": 0.2260536398467433,
"acc_norm_stderr": 0.01495745850433583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.022797110278071138,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.022797110278071138
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767864,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.025457756696667874,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.025457756696667874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2508038585209003,
"acc_stderr": 0.024619771956697165,
"acc_norm": 0.2508038585209003,
"acc_norm_stderr": 0.024619771956697165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22839506172839505,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.22839506172839505,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.025645553622266722,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.025645553622266722
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2405475880052151,
"acc_stderr": 0.01091640673547895,
"acc_norm": 0.2405475880052151,
"acc_norm_stderr": 0.01091640673547895
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4522058823529412,
"acc_stderr": 0.030233758551596452,
"acc_norm": 0.4522058823529412,
"acc_norm_stderr": 0.030233758551596452
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884603,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884603
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.030709824050565274,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.030709824050565274
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23391812865497075,
"acc_stderr": 0.03246721765117825,
"acc_norm": 0.23391812865497075,
"acc_norm_stderr": 0.03246721765117825
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871122,
"mc2": 0.3852958717643368,
"mc2_stderr": 0.014551409809539893
}
}
```
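To work with these numbers programmatically, one option is to download the raw results file and index into it. The sketch below assumes the `huggingface_hub` library is installed and that the metrics follow the layout printed above; in some harness versions they are nested under a top-level `"results"` key, which the code allows for:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file linked above (repo id and filename taken from this card).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_nicholasKluge__Aira-2-355M",
    filename="results_2023-09-13T16-42-47.066460.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Metrics may sit at the top level (as printed above) or under a "results" key,
# depending on the harness version; handle both.
metrics = data.get("results", data)
print(metrics["all"]["acc"], metrics["all"]["mc2"])
```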
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-small | 2023-09-13T16:49:04.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TurkuNLP/gpt3-finnish-small
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TurkuNLP/gpt3-finnish-small](https://huggingface.co/TurkuNLP/gpt3-finnish-small)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-small\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T16:47:47.482079](https://huggingface.co/datasets/open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-small/blob/main/results_2023-09-13T16-47-47.482079.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24373999934057838,\n\
\ \"acc_stderr\": 0.0310628658202476,\n \"acc_norm\": 0.24461211305809413,\n\
\ \"acc_norm_stderr\": 0.031080004652671384,\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.01522589934082683,\n \"mc2\": 0.4646530212769539,\n\
\ \"mc2_stderr\": 0.016243370105856934\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.16467576791808874,\n \"acc_stderr\": 0.010838369209479231,\n\
\ \"acc_norm\": 0.20477815699658702,\n \"acc_norm_stderr\": 0.011792544338513402\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2695678151762597,\n\
\ \"acc_stderr\": 0.0044282842101035615,\n \"acc_norm\": 0.280920135431189,\n\
\ \"acc_norm_stderr\": 0.004485300194072271\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24528301886792453,\n \"acc_stderr\": 0.026480357179895702,\n\
\ \"acc_norm\": 0.24528301886792453,\n \"acc_norm_stderr\": 0.026480357179895702\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.15,\n\
\ \"acc_stderr\": 0.03588702812826372,\n \"acc_norm\": 0.15,\n \
\ \"acc_norm_stderr\": 0.03588702812826372\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.03512207412302052,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.03512207412302052\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2967741935483871,\n \"acc_stderr\": 0.025988500792411898,\n \"\
acc_norm\": 0.2967741935483871,\n \"acc_norm_stderr\": 0.025988500792411898\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.17733990147783252,\n \"acc_stderr\": 0.02687433727680835,\n \"\
acc_norm\": 0.17733990147783252,\n \"acc_norm_stderr\": 0.02687433727680835\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.30303030303030304,\n \"acc_stderr\": 0.03588624800091709,\n\
\ \"acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.03588624800091709\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23737373737373738,\n \"acc_stderr\": 0.030313710538198892,\n \"\
acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.030313710538198892\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178256,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178256\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2282051282051282,\n \"acc_stderr\": 0.02127839386358628,\n \
\ \"acc_norm\": 0.2282051282051282,\n \"acc_norm_stderr\": 0.02127839386358628\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.027722065493361266,\n\
\ \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.027722065493361266\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1944954128440367,\n \"acc_stderr\": 0.016970289090458047,\n \"\
acc_norm\": 0.1944954128440367,\n \"acc_norm_stderr\": 0.016970289090458047\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n\
\ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.30493273542600896,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.30493273542600896,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.038935425188248475,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.038935425188248475\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23116219667943805,\n\
\ \"acc_stderr\": 0.015075523238101088,\n \"acc_norm\": 0.23116219667943805,\n\
\ \"acc_norm_stderr\": 0.015075523238101088\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1832797427652733,\n\
\ \"acc_stderr\": 0.02197419884826581,\n \"acc_norm\": 0.1832797427652733,\n\
\ \"acc_norm_stderr\": 0.02197419884826581\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.22695035460992907,\n \"acc_stderr\": 0.024987106365642973,\n \
\ \"acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.024987106365642973\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2542372881355932,\n\
\ \"acc_stderr\": 0.011121129007840668,\n \"acc_norm\": 0.2542372881355932,\n\
\ \"acc_norm_stderr\": 0.011121129007840668\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714857,\n\
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714857\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.03895091015724136,\n\
\ \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.03895091015724136\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.1673469387755102,\n\
\ \"acc_stderr\": 0.02389714476891452,\n \"acc_norm\": 0.1673469387755102,\n\
\ \"acc_norm_stderr\": 0.02389714476891452\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.22885572139303484,\n \"acc_stderr\": 0.029705284056772426,\n\
\ \"acc_norm\": 0.22885572139303484,\n \"acc_norm_stderr\": 0.029705284056772426\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.2891566265060241,\n \"acc_stderr\": 0.03529486801511115,\n\
\ \"acc_norm\": 0.2891566265060241,\n \"acc_norm_stderr\": 0.03529486801511115\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.32748538011695905,\n\
\ \"acc_stderr\": 0.035993357714560276,\n \"acc_norm\": 0.32748538011695905,\n\
\ \"acc_norm_stderr\": 0.035993357714560276\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2533659730722154,\n \"mc1_stderr\": 0.01522589934082683,\n\
\ \"mc2\": 0.4646530212769539,\n \"mc2_stderr\": 0.016243370105856934\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TurkuNLP/gpt3-finnish-small
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|arc:challenge|25_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hellaswag|10_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-47-47.482079.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-47-47.482079.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T16-47-47.482079.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T16-47-47.482079.parquet'
- config_name: results
data_files:
- split: 2023_09_13T16_47_47.482079
path:
- results_2023-09-13T16-47-47.482079.parquet
- split: latest
path:
- results_2023-09-13T16-47-47.482079.parquet
---
# Dataset Card for Evaluation run of TurkuNLP/gpt3-finnish-small
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TurkuNLP/gpt3-finnish-small
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TurkuNLP/gpt3-finnish-small](https://huggingface.co/TurkuNLP/gpt3-finnish-small) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-small",
"harness_truthfulqa_mc_0",
split="train")
```
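The aggregated metrics live in the separate `results` configuration; a minimal sketch for inspecting them, using the `latest` split defined in the configs above:
```python
from datasets import load_dataset

# Load the aggregated results of the run; the "latest" split always
# points to the most recent evaluation timestamp.
results = load_dataset("open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-small",
                       "results",
                       split="latest")
```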
## Latest results
These are the [latest results from run 2023-09-13T16:47:47.482079](https://huggingface.co/datasets/open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-small/blob/main/results_2023-09-13T16-47-47.482079.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24373999934057838,
"acc_stderr": 0.0310628658202476,
"acc_norm": 0.24461211305809413,
"acc_norm_stderr": 0.031080004652671384,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.01522589934082683,
"mc2": 0.4646530212769539,
"mc2_stderr": 0.016243370105856934
},
"harness|arc:challenge|25": {
"acc": 0.16467576791808874,
"acc_stderr": 0.010838369209479231,
"acc_norm": 0.20477815699658702,
"acc_norm_stderr": 0.011792544338513402
},
"harness|hellaswag|10": {
"acc": 0.2695678151762597,
"acc_stderr": 0.0044282842101035615,
"acc_norm": 0.280920135431189,
"acc_norm_stderr": 0.004485300194072271
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24528301886792453,
"acc_stderr": 0.026480357179895702,
"acc_norm": 0.24528301886792453,
"acc_norm_stderr": 0.026480357179895702
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826372,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826372
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302052,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302052
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2967741935483871,
"acc_stderr": 0.025988500792411898,
"acc_norm": 0.2967741935483871,
"acc_norm_stderr": 0.025988500792411898
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.17733990147783252,
"acc_stderr": 0.02687433727680835,
"acc_norm": 0.17733990147783252,
"acc_norm_stderr": 0.02687433727680835
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.03588624800091709,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.03588624800091709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.030313710538198892,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.030313710538198892
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178256,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178256
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2282051282051282,
"acc_stderr": 0.02127839386358628,
"acc_norm": 0.2282051282051282,
"acc_norm_stderr": 0.02127839386358628
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.027722065493361266,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.027722065493361266
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436775,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1944954128440367,
"acc_stderr": 0.016970289090458047,
"acc_norm": 0.1944954128440367,
"acc_norm_stderr": 0.016970289090458047
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.30493273542600896,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.30493273542600896,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.038935425188248475,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.038935425188248475
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23116219667943805,
"acc_stderr": 0.015075523238101088,
"acc_norm": 0.23116219667943805,
"acc_norm_stderr": 0.015075523238101088
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1832797427652733,
"acc_stderr": 0.02197419884826581,
"acc_norm": 0.1832797427652733,
"acc_norm_stderr": 0.02197419884826581
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22695035460992907,
"acc_stderr": 0.024987106365642973,
"acc_norm": 0.22695035460992907,
"acc_norm_stderr": 0.024987106365642973
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2542372881355932,
"acc_stderr": 0.011121129007840668,
"acc_norm": 0.2542372881355932,
"acc_norm_stderr": 0.011121129007840668
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.030042615832714857,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.030042615832714857
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724136,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724136
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1673469387755102,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.1673469387755102,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772426,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772426
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.01522589934082683,
"mc2": 0.4646530212769539,
"mc2_stderr": 0.016243370105856934
}
}
```
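To pull individual task scores out of this dump programmatically, here is a small sketch in plain Python. It assumes the dictionary shown above has been saved as-is to a local `results.json`; the hosted results file may wrap it in additional metadata.
```python
import json

# Assumes the dictionary printed above was saved locally as results.json;
# the hosted results file may wrap it in extra metadata.
with open("results.json") as f:
    results = json.load(f)

# Rank per-task accuracies, skipping the "all" aggregate and
# truthfulqa:mc, which reports mc1/mc2 instead of acc.
per_task = {task: scores["acc"]
            for task, scores in results.items()
            if task != "all" and "acc" in scores}
for task, acc in sorted(per_task.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc:.3f}")
```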
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/yuigahama_yui_yahariorenoseishunlovecomewamachigatteiru | 2023-09-17T17:35:50.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yuigahama Yui
This is the dataset of Yuigahama Yui, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 674 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 674 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 674 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 674 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
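To grab a single pack from the table above without cloning the whole repository, a minimal sketch with `huggingface_hub` (assuming the zip files sit at the repo root, as the relative links in the table suggest):
```python
from huggingface_hub import hf_hub_download
import zipfile

# Download one pack from the dataset repo and unpack it locally.
path = hf_hub_download(
    repo_id="CyberHarem/yuigahama_yui_yahariorenoseishunlovecomewamachigatteiru",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(path) as zf:
    zf.extractall("yuigahama_yui_512x512")
```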
|
open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B | 2023-09-13T17:03:30.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Weyaxi/Luban-Marcoroni-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/Luban-Marcoroni-13B](https://huggingface.co/Weyaxi/Luban-Marcoroni-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T17:02:11.381984](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B/blob/main/results_2023-09-13T17-02-11.381984.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5880928501323799,\n\
\ \"acc_stderr\": 0.034047539642801355,\n \"acc_norm\": 0.5919546775446551,\n\
\ \"acc_norm_stderr\": 0.034026152719071764,\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.5555366335940738,\n\
\ \"mc2_stderr\": 0.01573177603658692\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6109215017064846,\n \"acc_stderr\": 0.014247309976045607,\n\
\ \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068285\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6269667396932882,\n\
\ \"acc_stderr\": 0.00482622478485044,\n \"acc_norm\": 0.8292172873929496,\n\
\ \"acc_norm_stderr\": 0.003755498941781851\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.04309732901036356,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.04309732901036356\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336937,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336937\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36507936507936506,\n \"acc_stderr\": 0.02479606060269995,\n \"\
acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.02479606060269995\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7032258064516129,\n\
\ \"acc_stderr\": 0.02598850079241189,\n \"acc_norm\": 0.7032258064516129,\n\
\ \"acc_norm_stderr\": 0.02598850079241189\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164552,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164552\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5666666666666667,\n \"acc_stderr\": 0.025124653525885117,\n\
\ \"acc_norm\": 0.5666666666666667,\n \"acc_norm_stderr\": 0.025124653525885117\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n\
\ \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7577981651376147,\n \"acc_stderr\": 0.01836817630659862,\n \"\
acc_norm\": 0.7577981651376147,\n \"acc_norm_stderr\": 0.01836817630659862\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.02786594228663933,\n \"acc_norm\"\
: 0.803921568627451,\n \"acc_norm_stderr\": 0.02786594228663933\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \"\
acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489277,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489277\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.014836205167333562,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.014836205167333562\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.0165136760311796,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.0165136760311796\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.027245613047215365,\n\
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.027245613047215365\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.02698147804364804,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.02698147804364804\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882116,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882116\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42633637548891784,\n\
\ \"acc_stderr\": 0.012630884771599694,\n \"acc_norm\": 0.42633637548891784,\n\
\ \"acc_norm_stderr\": 0.012630884771599694\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324224,\n \
\ \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324224\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.0389136449583582,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.0389136449583582\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.5555366335940738,\n\
\ \"mc2_stderr\": 0.01573177603658692\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/Luban-Marcoroni-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|arc:challenge|25_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hellaswag|10_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-02-11.381984.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-02-11.381984.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T17-02-11.381984.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T17-02-11.381984.parquet'
- config_name: results
data_files:
- split: 2023_09_13T17_02_11.381984
path:
- results_2023-09-13T17-02-11.381984.parquet
- split: latest
path:
- results_2023-09-13T17-02-11.381984.parquet
---
# Dataset Card for Evaluation run of Weyaxi/Luban-Marcoroni-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/Luban-Marcoroni-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/Luban-Marcoroni-13B](https://huggingface.co/Weyaxi/Luban-Marcoroni-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
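The available configurations can also be listed programmatically. A minimal sketch using the `datasets` library (the repository name is the one this card describes):
```python
from datasets import get_dataset_config_names

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B")
print(len(configs), configs[:3])
```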
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B",
"harness_truthfulqa_mc_0",
	split="latest")  # splits are named by run timestamp, with "latest" pointing at the newest run
```
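The aggregated numbers live in the "results" configuration; a minimal sketch loading them, assuming the "latest" split defined in the metadata above:
```python
from datasets import load_dataset

# Load the aggregated results (the same figures shown under "Latest results" below).
results = load_dataset("open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B",
	"results",
	split="latest")
print(results[0])  # inspect the stored aggregate record for this run
```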
## Latest results
These are the [latest results from run 2023-09-13T17:02:11.381984](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B/blob/main/results_2023-09-13T17-02-11.381984.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5880928501323799,
"acc_stderr": 0.034047539642801355,
"acc_norm": 0.5919546775446551,
"acc_norm_stderr": 0.034026152719071764,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5555366335940738,
"mc2_stderr": 0.01573177603658692
},
"harness|arc:challenge|25": {
"acc": 0.6109215017064846,
"acc_stderr": 0.014247309976045607,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068285
},
"harness|hellaswag|10": {
"acc": 0.6269667396932882,
"acc_stderr": 0.00482622478485044,
"acc_norm": 0.8292172873929496,
"acc_norm_stderr": 0.003755498941781851
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.04309732901036356,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.04309732901036356
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336937,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336937
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.02479606060269995,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.02479606060269995
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7032258064516129,
"acc_stderr": 0.02598850079241189,
"acc_norm": 0.7032258064516129,
"acc_norm_stderr": 0.02598850079241189
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164552,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5666666666666667,
"acc_stderr": 0.025124653525885117,
"acc_norm": 0.5666666666666667,
"acc_norm_stderr": 0.025124653525885117
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7577981651376147,
"acc_stderr": 0.01836817630659862,
"acc_norm": 0.7577981651376147,
"acc_norm_stderr": 0.01836817630659862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.02786594228663933,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.02786594228663933
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489277,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489277
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333562,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333562
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.0165136760311796,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.0165136760311796
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.027245613047215365,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.027245613047215365
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364804,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364804
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02622964917882116,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02622964917882116
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42633637548891784,
"acc_stderr": 0.012630884771599694,
"acc_norm": 0.42633637548891784,
"acc_norm_stderr": 0.012630884771599694
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324224,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324224
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.0389136449583582,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.0389136449583582
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5555366335940738,
"mc2_stderr": 0.01573177603658692
}
}
```
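To relate the per-task entries above to a single MMLU-style figure, one can macro-average the `acc` values of the `hendrycksTest-*` entries. A minimal sketch over a hypothetical excerpt of the dict above (illustrative only; not necessarily the leaderboard's exact aggregation):
```python
# Hypothetical excerpt of the results dict shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5333333333333333},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.625},
}

# Macro-average accuracy over the MMLU (hendrycksTest) tasks.
mmlu = {k: v["acc"] for k, v in results.items() if "hendrycksTest" in k}
print(f"MMLU macro-average over {len(mmlu)} tasks: {sum(mmlu.values()) / len(mmlu):.4f}")
```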
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
u6ujh6h6/gg | 2023-09-13T17:06:15.000Z | [
"license:unknown",
"region:us"
] | u6ujh6h6 | null | null | null | 0 | 0 | ---
license: unknown
---
|
Khoai/ms | 2023-09-13T17:07:31.000Z | [
"region:us"
] | Khoai | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ | 2023-09-13T17:16:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of elliotthwang/Elliott-Chinese-LLaMa-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elliotthwang/Elliott-Chinese-LLaMa-GPTQ](https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T17:15:37.349272](https://huggingface.co/datasets/open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ/blob/main/results_2023-09-13T17-15-37.349272.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4964016914772728,\n\
\ \"acc_stderr\": 0.035114825047806095,\n \"acc_norm\": 0.500432917901348,\n\
\ \"acc_norm_stderr\": 0.03510428827901411,\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.0159835951018114,\n \"mc2\": 0.45093918484971685,\n\
\ \"mc2_stderr\": 0.014738817903940389\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.46757679180887374,\n \"acc_stderr\": 0.014580637569995423,\n\
\ \"acc_norm\": 0.5102389078498294,\n \"acc_norm_stderr\": 0.014608326906285015\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5571599283011353,\n\
\ \"acc_stderr\": 0.004957068377516512,\n \"acc_norm\": 0.7523401712806214,\n\
\ \"acc_norm_stderr\": 0.004307709682499536\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.539622641509434,\n \"acc_stderr\": 0.030676096599389184,\n\
\ \"acc_norm\": 0.539622641509434,\n \"acc_norm_stderr\": 0.030676096599389184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30687830687830686,\n \"acc_stderr\": 0.023752928712112143,\n \"\
acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.023752928712112143\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5612903225806452,\n\
\ \"acc_stderr\": 0.028229497320317213,\n \"acc_norm\": 0.5612903225806452,\n\
\ \"acc_norm_stderr\": 0.028229497320317213\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3251231527093596,\n \"acc_stderr\": 0.032957975663112704,\n\
\ \"acc_norm\": 0.3251231527093596,\n \"acc_norm_stderr\": 0.032957975663112704\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6464646464646465,\n \"acc_stderr\": 0.03406086723547155,\n \"\
acc_norm\": 0.6464646464646465,\n \"acc_norm_stderr\": 0.03406086723547155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.031821550509166456,\n\
\ \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.031821550509166456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45384615384615384,\n \"acc_stderr\": 0.025242770987126174,\n\
\ \"acc_norm\": 0.45384615384615384,\n \"acc_norm_stderr\": 0.025242770987126174\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.44537815126050423,\n \"acc_stderr\": 0.0322841062671639,\n \
\ \"acc_norm\": 0.44537815126050423,\n \"acc_norm_stderr\": 0.0322841062671639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.689908256880734,\n \"acc_stderr\": 0.019830849684439756,\n \"\
acc_norm\": 0.689908256880734,\n \"acc_norm_stderr\": 0.019830849684439756\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6225490196078431,\n \"acc_stderr\": 0.03402272044340703,\n \"\
acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.03402272044340703\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6708860759493671,\n \"acc_stderr\": 0.030587326294702368,\n \
\ \"acc_norm\": 0.6708860759493671,\n \"acc_norm_stderr\": 0.030587326294702368\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449296,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449296\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5276073619631901,\n \"acc_stderr\": 0.0392237829061099,\n\
\ \"acc_norm\": 0.5276073619631901,\n \"acc_norm_stderr\": 0.0392237829061099\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7393162393162394,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.7393162393162394,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6973180076628352,\n\
\ \"acc_stderr\": 0.016428781581749364,\n \"acc_norm\": 0.6973180076628352,\n\
\ \"acc_norm_stderr\": 0.016428781581749364\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5028901734104047,\n \"acc_stderr\": 0.026918645383239004,\n\
\ \"acc_norm\": 0.5028901734104047,\n \"acc_norm_stderr\": 0.026918645383239004\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.288268156424581,\n\
\ \"acc_stderr\": 0.015149132860209427,\n \"acc_norm\": 0.288268156424581,\n\
\ \"acc_norm_stderr\": 0.015149132860209427\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.028358956313423552,\n\
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.028358956313423552\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.572347266881029,\n\
\ \"acc_stderr\": 0.02809924077580956,\n \"acc_norm\": 0.572347266881029,\n\
\ \"acc_norm_stderr\": 0.02809924077580956\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.027513747284379417,\n\
\ \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.027513747284379417\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.02872386385328128,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.02872386385328128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34028683181225555,\n\
\ \"acc_stderr\": 0.012101217610223786,\n \"acc_norm\": 0.34028683181225555,\n\
\ \"acc_norm_stderr\": 0.012101217610223786\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.03034326422421352,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.03034326422421352\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4738562091503268,\n \"acc_stderr\": 0.020200164564804588,\n \
\ \"acc_norm\": 0.4738562091503268,\n \"acc_norm_stderr\": 0.020200164564804588\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5306122448979592,\n \"acc_stderr\": 0.031949171367580624,\n\
\ \"acc_norm\": 0.5306122448979592,\n \"acc_norm_stderr\": 0.031949171367580624\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.03220024104534204,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.03220024104534204\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.0159835951018114,\n \"mc2\": 0.45093918484971685,\n\
\ \"mc2_stderr\": 0.014738817903940389\n }\n}\n```"
repo_url: https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|arc:challenge|25_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hellaswag|10_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-15-37.349272.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-15-37.349272.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T17-15-37.349272.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T17-15-37.349272.parquet'
- config_name: results
data_files:
- split: 2023_09_13T17_15_37.349272
path:
- results_2023-09-13T17-15-37.349272.parquet
- split: latest
path:
- results_2023-09-13T17-15-37.349272.parquet
---
# Dataset Card for Evaluation run of elliotthwang/Elliott-Chinese-LLaMa-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elliotthwang/Elliott-Chinese-LLaMa-GPTQ](https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset

# Each configuration corresponds to one evaluated task; the "train" split
# always mirrors the latest run (timestamped splits pin specific runs).
data = load_dataset("open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ",
                    "harness_truthfulqa_mc_0",
                    split="train")
```
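The aggregated metrics live in the "results" configuration. A minimal sketch of loading them, using the config and split names from the YAML header above:
```python
from datasets import load_dataset

# The "latest" split always points at the most recent eval; the timestamped
# split (here "2023_09_13T17_15_37.349272") pins a specific run.
results = load_dataset(
    "open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ",
    "results",
    split="latest",
)
print(results[0])
```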
## Latest results
These are the [latest results from run 2023-09-13T17:15:37.349272](https://huggingface.co/datasets/open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ/blob/main/results_2023-09-13T17-15-37.349272.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4964016914772728,
"acc_stderr": 0.035114825047806095,
"acc_norm": 0.500432917901348,
"acc_norm_stderr": 0.03510428827901411,
"mc1": 0.2962056303549572,
"mc1_stderr": 0.0159835951018114,
"mc2": 0.45093918484971685,
"mc2_stderr": 0.014738817903940389
},
"harness|arc:challenge|25": {
"acc": 0.46757679180887374,
"acc_stderr": 0.014580637569995423,
"acc_norm": 0.5102389078498294,
"acc_norm_stderr": 0.014608326906285015
},
"harness|hellaswag|10": {
"acc": 0.5571599283011353,
"acc_stderr": 0.004957068377516512,
"acc_norm": 0.7523401712806214,
"acc_norm_stderr": 0.004307709682499536
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.539622641509434,
"acc_stderr": 0.030676096599389184,
"acc_norm": 0.539622641509434,
"acc_norm_stderr": 0.030676096599389184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5069444444444444,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.5069444444444444,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.023752928712112143,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.023752928712112143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.028229497320317213,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.028229497320317213
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3251231527093596,
"acc_stderr": 0.032957975663112704,
"acc_norm": 0.3251231527093596,
"acc_norm_stderr": 0.032957975663112704
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6464646464646465,
"acc_stderr": 0.03406086723547155,
"acc_norm": 0.6464646464646465,
"acc_norm_stderr": 0.03406086723547155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7357512953367875,
"acc_stderr": 0.031821550509166456,
"acc_norm": 0.7357512953367875,
"acc_norm_stderr": 0.031821550509166456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45384615384615384,
"acc_stderr": 0.025242770987126174,
"acc_norm": 0.45384615384615384,
"acc_norm_stderr": 0.025242770987126174
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.44537815126050423,
"acc_stderr": 0.0322841062671639,
"acc_norm": 0.44537815126050423,
"acc_norm_stderr": 0.0322841062671639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.689908256880734,
"acc_stderr": 0.019830849684439756,
"acc_norm": 0.689908256880734,
"acc_norm_stderr": 0.019830849684439756
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.03402272044340703,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.03402272044340703
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6708860759493671,
"acc_stderr": 0.030587326294702368,
"acc_norm": 0.6708860759493671,
"acc_norm_stderr": 0.030587326294702368
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449296,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449296
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5276073619631901,
"acc_stderr": 0.0392237829061099,
"acc_norm": 0.5276073619631901,
"acc_norm_stderr": 0.0392237829061099
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7393162393162394,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.7393162393162394,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6973180076628352,
"acc_stderr": 0.016428781581749364,
"acc_norm": 0.6973180076628352,
"acc_norm_stderr": 0.016428781581749364
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.026918645383239004,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.026918645383239004
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.288268156424581,
"acc_stderr": 0.015149132860209427,
"acc_norm": 0.288268156424581,
"acc_norm_stderr": 0.015149132860209427
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.028358956313423552,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.028358956313423552
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.572347266881029,
"acc_stderr": 0.02809924077580956,
"acc_norm": 0.572347266881029,
"acc_norm_stderr": 0.02809924077580956
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.027513747284379417,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.027513747284379417
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.02872386385328128,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.02872386385328128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34028683181225555,
"acc_stderr": 0.012101217610223786,
"acc_norm": 0.34028683181225555,
"acc_norm_stderr": 0.012101217610223786
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.03034326422421352,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.03034326422421352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4738562091503268,
"acc_stderr": 0.020200164564804588,
"acc_norm": 0.4738562091503268,
"acc_norm_stderr": 0.020200164564804588
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5306122448979592,
"acc_stderr": 0.031949171367580624,
"acc_norm": 0.5306122448979592,
"acc_norm_stderr": 0.031949171367580624
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.03220024104534204,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.03220024104534204
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2962056303549572,
"mc1_stderr": 0.0159835951018114,
"mc2": 0.45093918484971685,
"mc2_stderr": 0.014738817903940389
}
}
```
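The per-subject `harness|hendrycksTest-*` entries can be aggregated into a single MMLU-style score. A small sketch of that aggregation (not part of the original card), assuming the JSON document above has been parsed into a Python dict named `results`:
```python
# Sketch: `results` is assumed to hold the parsed JSON document shown above.
# Macro-average the accuracy over the MMLU (hendrycksTest) subjects only;
# "all", ARC, HellaSwag, and TruthfulQA entries are excluded by the filter.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
print(f"MMLU macro-average over {len(mmlu_accs)} subjects: "
      f"{sum(mmlu_accs) / len(mmlu_accs):.4f}")
```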
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/kamiya_nao_idolmastercinderellagirls | 2023-09-17T17:35:52.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kamiya_nao (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kamiya_nao (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 510 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 510 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 510 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 510 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
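One way to fetch an archive listed above, assuming the zips sit at the repository root as the relative download links suggest, is a minimal `huggingface_hub` sketch:
```python
from huggingface_hub import hf_hub_download

# Download one of the packaged archives from this dataset repository;
# the filename follows the table's "Download" links (an assumption here).
path = hf_hub_download(
    repo_id="CyberHarem/kamiya_nao_idolmastercinderellagirls",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
print(path)  # local cache path of the downloaded zip
```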
|
CyberHarem/hiratsuka_shizuka_yahariorenoseishunlovecomewamachigatteiru | 2023-09-17T17:35:54.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Hiratsuka Shizuka
This is the dataset of Hiratsuka Shizuka, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 680 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 680 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 680 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 680 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/yukinoshita_haruno_yahariorenoseishunlovecomewamachigatteiru | 2023-09-17T17:35:57.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yukinoshita Haruno
This is the dataset of Yukinoshita Haruno, containing 227 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 227 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 509 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 227 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 227 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 227 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 227 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 227 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 509 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 509 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 509 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-13B | 2023-09-13T17:56:31.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TurkuNLP/gpt3-finnish-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TurkuNLP/gpt3-finnish-13B](https://huggingface.co/TurkuNLP/gpt3-finnish-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T17:54:19.739252](https://huggingface.co/datasets/open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-13B/blob/main/results_2023-09-13T17-54-19.739252.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23729125190541017,\n\
\ \"acc_stderr\": 0.030790729991307567,\n \"acc_norm\": 0.239047780887342,\n\
\ \"acc_norm_stderr\": 0.030794370568286784,\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.44465058814303504,\n\
\ \"mc2_stderr\": 0.014989738570716185\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2440273037542662,\n \"acc_stderr\": 0.012551447627856259,\n\
\ \"acc_norm\": 0.24658703071672355,\n \"acc_norm_stderr\": 0.01259572626879013\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.36656044612626965,\n\
\ \"acc_stderr\": 0.004808802114592824,\n \"acc_norm\": 0.4676359290977893,\n\
\ \"acc_norm_stderr\": 0.0049793175154325305\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n\
\ \"acc_stderr\": 0.03853254836552004,\n \"acc_norm\": 0.2740740740740741,\n\
\ \"acc_norm_stderr\": 0.03853254836552004\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677077,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677077\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.16,\n \"acc_stderr\": 0.0368452949177471,\n \"acc_norm\"\
: 0.16,\n \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322674,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322674\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.031265112061730424,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.031265112061730424\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2723404255319149,\n \"acc_stderr\": 0.029101290698386705,\n\
\ \"acc_norm\": 0.2723404255319149,\n \"acc_norm_stderr\": 0.029101290698386705\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.19298245614035087,\n\
\ \"acc_stderr\": 0.037124548537213684,\n \"acc_norm\": 0.19298245614035087,\n\
\ \"acc_norm_stderr\": 0.037124548537213684\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.03416520447747548,\n\
\ \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.03416520447747548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.02226181769240017,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.02226181769240017\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790609,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790609\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22580645161290322,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.22580645161290322,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.19704433497536947,\n \"acc_stderr\": 0.027986724666736212,\n\
\ \"acc_norm\": 0.19704433497536947,\n \"acc_norm_stderr\": 0.027986724666736212\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.19696969696969696,\n \"acc_stderr\": 0.028335609732463355,\n \"\
acc_norm\": 0.19696969696969696,\n \"acc_norm_stderr\": 0.028335609732463355\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817227,\n\
\ \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817227\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2358974358974359,\n \"acc_stderr\": 0.02152596540740872,\n \
\ \"acc_norm\": 0.2358974358974359,\n \"acc_norm_stderr\": 0.02152596540740872\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.02592887613276611,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.02592887613276611\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.027553614467863814,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.027553614467863814\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.15894039735099338,\n \"acc_stderr\": 0.029852788528701008,\n \"\
acc_norm\": 0.15894039735099338,\n \"acc_norm_stderr\": 0.029852788528701008\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22752293577981653,\n \"acc_stderr\": 0.017974463578776502,\n \"\
acc_norm\": 0.22752293577981653,\n \"acc_norm_stderr\": 0.017974463578776502\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134217,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134217\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n\
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.29914529914529914,\n\
\ \"acc_stderr\": 0.029996951858349497,\n \"acc_norm\": 0.29914529914529914,\n\
\ \"acc_norm_stderr\": 0.029996951858349497\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25287356321839083,\n\
\ \"acc_stderr\": 0.01554337731371968,\n \"acc_norm\": 0.25287356321839083,\n\
\ \"acc_norm_stderr\": 0.01554337731371968\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023805186524888146,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023805186524888146\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2540192926045016,\n\
\ \"acc_stderr\": 0.02472386150477169,\n \"acc_norm\": 0.2540192926045016,\n\
\ \"acc_norm_stderr\": 0.02472386150477169\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.024748624490537365,\n\
\ \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.024748624490537365\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2127659574468085,\n \"acc_stderr\": 0.024414612974307706,\n \
\ \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.024414612974307706\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24837027379400262,\n\
\ \"acc_stderr\": 0.011035212598034498,\n \"acc_norm\": 0.24837027379400262,\n\
\ \"acc_norm_stderr\": 0.011035212598034498\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.22426470588235295,\n \"acc_stderr\": 0.02533684856333236,\n\
\ \"acc_norm\": 0.22426470588235295,\n \"acc_norm_stderr\": 0.02533684856333236\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25326797385620914,\n \"acc_stderr\": 0.01759348689536683,\n \
\ \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.01759348689536683\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.2545454545454545,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.02412746346265015,\n\
\ \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.02412746346265015\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.03571609230053481,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.03571609230053481\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.44465058814303504,\n\
\ \"mc2_stderr\": 0.014989738570716185\n }\n}\n```"
repo_url: https://huggingface.co/TurkuNLP/gpt3-finnish-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|arc:challenge|25_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hellaswag|10_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-54-19.739252.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-54-19.739252.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T17-54-19.739252.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T17-54-19.739252.parquet'
- config_name: results
data_files:
- split: 2023_09_13T17_54_19.739252
path:
- results_2023-09-13T17-54-19.739252.parquet
- split: latest
path:
- results_2023-09-13T17-54-19.739252.parquet
---
# Dataset Card for Evaluation run of TurkuNLP/gpt3-finnish-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TurkuNLP/gpt3-finnish-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TurkuNLP/gpt3-finnish-13B](https://huggingface.co/TurkuNLP/gpt3-finnish-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-13B",
"harness_truthfulqa_mc_0",
split="train")
```
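The same pattern works for the aggregated metrics: the "results" configuration listed in the frontmatter above exposes both the timestamped split and a "latest" alias. A minimal sketch, using the split names from the configs above:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated scores of the run;
# the "latest" split always resolves to the most recent results file.
results = load_dataset("open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-13B",
                       "results",
                       split="latest")
print(results[0])
```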
## Latest results
These are the [latest results from run 2023-09-13T17:54:19.739252](https://huggingface.co/datasets/open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-13B/blob/main/results_2023-09-13T17-54-19.739252.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23729125190541017,
"acc_stderr": 0.030790729991307567,
"acc_norm": 0.239047780887342,
"acc_norm_stderr": 0.030794370568286784,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.44465058814303504,
"mc2_stderr": 0.014989738570716185
},
"harness|arc:challenge|25": {
"acc": 0.2440273037542662,
"acc_stderr": 0.012551447627856259,
"acc_norm": 0.24658703071672355,
"acc_norm_stderr": 0.01259572626879013
},
"harness|hellaswag|10": {
"acc": 0.36656044612626965,
"acc_stderr": 0.004808802114592824,
"acc_norm": 0.4676359290977893,
"acc_norm_stderr": 0.0049793175154325305
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.03853254836552004,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.03853254836552004
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677077,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677077
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.0368452949177471,
"acc_norm": 0.16,
"acc_norm_stderr": 0.0368452949177471
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322674,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322674
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.031265112061730424,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.031265112061730424
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2723404255319149,
"acc_stderr": 0.029101290698386705,
"acc_norm": 0.2723404255319149,
"acc_norm_stderr": 0.029101290698386705
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.19298245614035087,
"acc_stderr": 0.037124548537213684,
"acc_norm": 0.19298245614035087,
"acc_norm_stderr": 0.037124548537213684
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.03416520447747548,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.03416520447747548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.02226181769240017,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.02226181769240017
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790609,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790609
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22580645161290322,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.22580645161290322,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.19704433497536947,
"acc_stderr": 0.027986724666736212,
"acc_norm": 0.19704433497536947,
"acc_norm_stderr": 0.027986724666736212
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.19696969696969696,
"acc_stderr": 0.028335609732463355,
"acc_norm": 0.19696969696969696,
"acc_norm_stderr": 0.028335609732463355
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817227,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817227
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2358974358974359,
"acc_stderr": 0.02152596540740872,
"acc_norm": 0.2358974358974359,
"acc_norm_stderr": 0.02152596540740872
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.02592887613276611,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.02592887613276611
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.027553614467863814,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.027553614467863814
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.15894039735099338,
"acc_stderr": 0.029852788528701008,
"acc_norm": 0.15894039735099338,
"acc_norm_stderr": 0.029852788528701008
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22752293577981653,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.22752293577981653,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134217,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134217
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34977578475336324,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.34977578475336324,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.29914529914529914,
"acc_stderr": 0.029996951858349497,
"acc_norm": 0.29914529914529914,
"acc_norm_stderr": 0.029996951858349497
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.25287356321839083,
"acc_stderr": 0.01554337731371968,
"acc_norm": 0.25287356321839083,
"acc_norm_stderr": 0.01554337731371968
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023805186524888146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023805186524888146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2540192926045016,
"acc_stderr": 0.02472386150477169,
"acc_norm": 0.2540192926045016,
"acc_norm_stderr": 0.02472386150477169
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.024748624490537365,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.024748624490537365
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.024414612974307706,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.024414612974307706
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24837027379400262,
"acc_stderr": 0.011035212598034498,
"acc_norm": 0.24837027379400262,
"acc_norm_stderr": 0.011035212598034498
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22426470588235295,
"acc_stderr": 0.02533684856333236,
"acc_norm": 0.22426470588235295,
"acc_norm_stderr": 0.02533684856333236
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.01759348689536683,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.01759348689536683
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.02412746346265015,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.02412746346265015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.03571609230053481,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.03571609230053481
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.44465058814303504,
"mc2_stderr": 0.014989738570716185
}
}
```
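If you want to recompute an aggregate such as the "all" entry from the per-task scores above, averaging the accuracies is straightforward. The snippet below is a minimal sketch, assuming the JSON shown above has been saved locally as `results.json`; the exact aggregation used by the leaderboard may differ:
```python
import json
from statistics import mean

# Load the per-task results (assumed saved from the block above as results.json).
with open("results.json") as f:
    results = json.load(f)

# Unweighted mean accuracy over every harness task except the "all" summary
# (truthfulqa reports mc1/mc2 rather than acc, so it is filtered out here).
task_accs = [scores["acc"] for name, scores in results.items()
             if name != "all" and "acc" in scores]
print(f"mean acc over {len(task_accs)} tasks: {mean(task_accs):.4f}")
```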
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/olivia_asobiasobase | 2023-09-17T17:35:59.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Olivia
This is the dataset of Olivia, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). A sketch for downloading one of the packs follows the table below.
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 641 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 641 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 641 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 641 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
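Each pack in the table is a plain zip stored in this dataset repository, so it can be fetched with the `huggingface_hub` client. A minimal sketch (the filename comes from the table above; the extraction directory is just an example):
```python
import zipfile

from huggingface_hub import hf_hub_download

# Download the 512x512 aligned pack from this dataset repo.
path = hf_hub_download(repo_id="CyberHarem/olivia_asobiasobase",
                       filename="dataset-512x512.zip",
                       repo_type="dataset")

# Extract it to a local folder (example path).
with zipfile.ZipFile(path) as zf:
    zf.extractall("olivia_512x512")
```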
|
CyberHarem/hikigaya_komachi_yahariorenoseishunlovecomewamachigatteiru | 2023-09-17T17:36:01.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Hikigaya Komachi
This is the dataset of Hikigaya Komachi, containing 220 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 220 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 472 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 220 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 220 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 220 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 220 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 220 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 472 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 472 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 472 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/ebina_hina_yahariorenoseishunlovecomewamachigatteiru | 2023-09-17T17:36:03.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Ebina Hina
This is the dataset of Ebina Hina, containing 149 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 149 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 353 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 149 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 149 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 149 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 149 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 149 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 353 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 353 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 353 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
LawBERT-tw/LawBERT_embedding | 2023-09-13T18:14:02.000Z | [
"region:us"
] | LawBERT-tw | null | null | null | 0 | 0 | Entry not found |
golightly/comparison-data-falcon | 2023-09-13T19:29:42.000Z | [
"region:us"
] | golightly | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
id: field
- name: response-1
dtype: string
id: field
- name: response-2
dtype: string
id: field
- name: choose-best
list:
- name: user_id
dtype: string
id: question
- name: value
dtype: int32
id: suggestion
- name: status
dtype: string
id: question
- name: choose-best-suggestion
dtype: int32
id: suggestion
- name: choose-best-suggestion-metadata
struct:
- name: type
dtype: string
id: suggestion-metadata
- name: score
dtype: float32
id: suggestion-metadata
- name: agent
dtype: string
id: suggestion-metadata
- name: external_id
dtype: string
id: external_id
- name: metadata
dtype: string
id: metadata
splits:
- name: train
num_bytes: 8163688
num_examples: 7401
download_size: 0
dataset_size: 8163688
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "comparison-data-falcon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hanako_honda_asobiasobase | 2023-09-17T17:36:05.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Hanako Honda
This is the dataset of Hanako Honda, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 674 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 674 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 674 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 674 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/isshiki_iroha_yahariorenoseishunlovecomewamachigatteiru | 2023-09-17T17:36:07.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Isshiki Iroha
This is the dataset of Isshiki Iroha, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 675 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 675 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 675 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 675 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/kobayakawa_sae_idolmastercinderellagirls | 2023-09-17T17:36:09.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kobayakawa_sae (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kobayakawa_sae (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 513 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 513 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 513 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 513 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_Secbone__llama-2-13B-instructed | 2023-09-13T18:40:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Secbone/llama-2-13B-instructed
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Secbone/llama-2-13B-instructed](https://huggingface.co/Secbone/llama-2-13B-instructed)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Secbone__llama-2-13B-instructed\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T18:39:18.624923](https://huggingface.co/datasets/open-llm-leaderboard/details_Secbone__llama-2-13B-instructed/blob/main/results_2023-09-13T18-39-18.624923.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5572252571915645,\n\
\ \"acc_stderr\": 0.03446062872001509,\n \"acc_norm\": 0.5611308557600665,\n\
\ \"acc_norm_stderr\": 0.03443910832643205,\n \"mc1\": 0.3219094247246022,\n\
\ \"mc1_stderr\": 0.0163555676119604,\n \"mc2\": 0.4688545512760683,\n\
\ \"mc2_stderr\": 0.015990152764591564\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5503412969283277,\n \"acc_stderr\": 0.014537144444284743,\n\
\ \"acc_norm\": 0.5938566552901023,\n \"acc_norm_stderr\": 0.01435165669009786\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6518621788488349,\n\
\ \"acc_stderr\": 0.0047540638677001775,\n \"acc_norm\": 0.8387771360286795,\n\
\ \"acc_norm_stderr\": 0.0036698484004877773\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.630188679245283,\n \"acc_stderr\": 0.029711421880107933,\n\
\ \"acc_norm\": 0.630188679245283,\n \"acc_norm_stderr\": 0.029711421880107933\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467381,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467381\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342658,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342658\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6645161290322581,\n \"acc_stderr\": 0.02686020644472434,\n \"\
acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.02686020644472434\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n \"\
acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.0331847733384533,\n \"acc_norm\"\
: 0.6818181818181818,\n \"acc_norm_stderr\": 0.0331847733384533\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.02951928261681724,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.02951928261681724\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5487179487179488,\n \"acc_stderr\": 0.025230381238934833,\n\
\ \"acc_norm\": 0.5487179487179488,\n \"acc_norm_stderr\": 0.025230381238934833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7394495412844037,\n \"acc_stderr\": 0.01881918203485007,\n \"\
acc_norm\": 0.7394495412844037,\n \"acc_norm_stderr\": 0.01881918203485007\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696042,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696042\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.02904133351059804,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.02904133351059804\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7598978288633461,\n\
\ \"acc_stderr\": 0.015274685213734191,\n \"acc_norm\": 0.7598978288633461,\n\
\ \"acc_norm_stderr\": 0.015274685213734191\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306393,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306393\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40670391061452515,\n\
\ \"acc_stderr\": 0.016428811915898865,\n \"acc_norm\": 0.40670391061452515,\n\
\ \"acc_norm_stderr\": 0.016428811915898865\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.02840830202033269,\n\
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.02840830202033269\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630443,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630443\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4348109517601043,\n\
\ \"acc_stderr\": 0.012661233805616302,\n \"acc_norm\": 0.4348109517601043,\n\
\ \"acc_norm_stderr\": 0.012661233805616302\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.030211479609121596,\n\
\ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.030211479609121596\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5882352941176471,\n \"acc_stderr\": 0.019910377463105932,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.019910377463105932\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872485,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.031067211262872485\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.031871875379197966,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.031871875379197966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3219094247246022,\n\
\ \"mc1_stderr\": 0.0163555676119604,\n \"mc2\": 0.4688545512760683,\n\
\ \"mc2_stderr\": 0.015990152764591564\n }\n}\n```"
repo_url: https://huggingface.co/Secbone/llama-2-13B-instructed
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|arc:challenge|25_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hellaswag|10_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T18-39-18.624923.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T18-39-18.624923.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T18-39-18.624923.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T18-39-18.624923.parquet'
- config_name: results
data_files:
- split: 2023_09_13T18_39_18.624923
path:
- results_2023-09-13T18-39-18.624923.parquet
- split: latest
path:
- results_2023-09-13T18-39-18.624923.parquet
---
# Dataset Card for Evaluation run of Secbone/llama-2-13B-instructed
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Secbone/llama-2-13B-instructed
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Secbone/llama-2-13B-instructed](https://huggingface.co/Secbone/llama-2-13B-instructed) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Secbone__llama-2-13B-instructed",
"harness_truthfulqa_mc_0",
split="train")
```
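The same call works for any of the configurations listed in the YAML header above. For example, the aggregated scores live in the "results" configuration, whose "latest" split points at the most recent run:

```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Secbone__llama-2-13B-instructed",
    "results",
    split="latest",
)
print(results[0])
```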
## Latest results
These are the [latest results from run 2023-09-13T18:39:18.624923](https://huggingface.co/datasets/open-llm-leaderboard/details_Secbone__llama-2-13B-instructed/blob/main/results_2023-09-13T18-39-18.624923.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5572252571915645,
"acc_stderr": 0.03446062872001509,
"acc_norm": 0.5611308557600665,
"acc_norm_stderr": 0.03443910832643205,
"mc1": 0.3219094247246022,
"mc1_stderr": 0.0163555676119604,
"mc2": 0.4688545512760683,
"mc2_stderr": 0.015990152764591564
},
"harness|arc:challenge|25": {
"acc": 0.5503412969283277,
"acc_stderr": 0.014537144444284743,
"acc_norm": 0.5938566552901023,
"acc_norm_stderr": 0.01435165669009786
},
"harness|hellaswag|10": {
"acc": 0.6518621788488349,
"acc_stderr": 0.0047540638677001775,
"acc_norm": 0.8387771360286795,
"acc_norm_stderr": 0.0036698484004877773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296564,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296564
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.630188679245283,
"acc_stderr": 0.029711421880107933,
"acc_norm": 0.630188679245283,
"acc_norm_stderr": 0.029711421880107933
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.024552292209342658,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.024552292209342658
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472434,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472434
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.0331847733384533,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.0331847733384533
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.02951928261681724,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.02951928261681724
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5487179487179488,
"acc_stderr": 0.025230381238934833,
"acc_norm": 0.5487179487179488,
"acc_norm_stderr": 0.025230381238934833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7394495412844037,
"acc_stderr": 0.01881918203485007,
"acc_norm": 0.7394495412844037,
"acc_norm_stderr": 0.01881918203485007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696042,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696042
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.02904133351059804,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.02904133351059804
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483717,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483717
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7598978288633461,
"acc_stderr": 0.015274685213734191,
"acc_norm": 0.7598978288633461,
"acc_norm_stderr": 0.015274685213734191
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.025992472029306393,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.025992472029306393
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40670391061452515,
"acc_stderr": 0.016428811915898865,
"acc_norm": 0.40670391061452515,
"acc_norm_stderr": 0.016428811915898865
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.02840830202033269,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.02840830202033269
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630443,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630443
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4348109517601043,
"acc_stderr": 0.012661233805616302,
"acc_norm": 0.4348109517601043,
"acc_norm_stderr": 0.012661233805616302
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.030211479609121596,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.030211479609121596
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.019910377463105932,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.019910377463105932
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.031067211262872485,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.031067211262872485
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.031871875379197966,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.031871875379197966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3219094247246022,
"mc1_stderr": 0.0163555676119604,
"mc2": 0.4688545512760683,
"mc2_stderr": 0.015990152764591564
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/totsuka_saika_yahariorenoseishunlovecomewamachigatteiru | 2023-09-17T17:36:11.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Totsuka Saika
This is the dataset of Totsuka Saika, containing 193 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 193 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 431 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 193 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 193 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 193 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 193 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 193 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 431 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 431 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 431 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
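As the packages above are plain zip files in this dataset repository (a minimal sketch, assuming the relative links in the table resolve against the repo root), one way to fetch a package programmatically is via `huggingface_hub`:

```python
from huggingface_hub import hf_hub_download

# Download one of the packaged zips listed in the table above.
# repo_id and filename are taken from this card; adjust to the package you need.
path = hf_hub_download(
    repo_id="CyberHarem/totsuka_saika_yahariorenoseishunlovecomewamachigatteiru",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
print(path)  # local path of the downloaded archive in the HF cache
```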
|
CyberHarem/kasumi_nomura_asobiasobase | 2023-09-17T17:36:13.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kasumi Nomura
This is the dataset of Kasumi Nomura, containing 300 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 646 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 646 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 646 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 646 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/miura_yumiko_yahariorenoseishunlovecomewamachigatteiru | 2023-09-17T17:36:15.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Miura Yumiko
This is the dataset of Miura Yumiko, containing 191 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 191 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 450 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 191 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 191 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 191 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 191 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 191 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 450 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 450 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 450 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_speechlessai__speechless-codellama-34b-v1.0 | 2023-09-13T19:11:09.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of speechlessai/speechless-codellama-34b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [speechlessai/speechless-codellama-34b-v1.0](https://huggingface.co/speechlessai/speechless-codellama-34b-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_speechlessai__speechless-codellama-34b-v1.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T19:09:51.319301](https://huggingface.co/datasets/open-llm-leaderboard/details_speechlessai__speechless-codellama-34b-v1.0/blob/main/results_2023-09-13T19-09-51.319301.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5342362830958945,\n\
\ \"acc_stderr\": 0.034949132938444705,\n \"acc_norm\": 0.538004053070666,\n\
\ \"acc_norm_stderr\": 0.03493872989072948,\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.016095884155386854,\n \"mc2\": 0.47135907975593017,\n\
\ \"mc2_stderr\": 0.014951001296424498\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49573378839590443,\n \"acc_stderr\": 0.014610858923956945,\n\
\ \"acc_norm\": 0.5247440273037542,\n \"acc_norm_stderr\": 0.014593487694937738\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.547998406691894,\n\
\ \"acc_stderr\": 0.004966736811010487,\n \"acc_norm\": 0.7412865962955587,\n\
\ \"acc_norm_stderr\": 0.004370328224831782\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.04060127035236395,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.04060127035236395\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4830188679245283,\n \"acc_stderr\": 0.030755120364119905,\n\
\ \"acc_norm\": 0.4830188679245283,\n \"acc_norm_stderr\": 0.030755120364119905\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467381,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467381\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520203,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520203\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5806451612903226,\n\
\ \"acc_stderr\": 0.02807158890109185,\n \"acc_norm\": 0.5806451612903226,\n\
\ \"acc_norm_stderr\": 0.02807158890109185\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6717171717171717,\n \"acc_stderr\": 0.033456784227567746,\n \"\
acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.033456784227567746\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7461139896373057,\n \"acc_stderr\": 0.03141024780565317,\n\
\ \"acc_norm\": 0.7461139896373057,\n \"acc_norm_stderr\": 0.03141024780565317\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48717948717948717,\n \"acc_stderr\": 0.02534267129380725,\n\
\ \"acc_norm\": 0.48717948717948717,\n \"acc_norm_stderr\": 0.02534267129380725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514566,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514566\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03242225027115006,\n \
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03242225027115006\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7211009174311926,\n \"acc_stderr\": 0.0192274688764635,\n \"acc_norm\"\
: 0.7211009174311926,\n \"acc_norm_stderr\": 0.0192274688764635\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4027777777777778,\n\
\ \"acc_stderr\": 0.033448873829978666,\n \"acc_norm\": 0.4027777777777778,\n\
\ \"acc_norm_stderr\": 0.033448873829978666\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03039153369274154\n \
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\"\
: 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036427,\n \"\
acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036427\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5190839694656488,\n \"acc_stderr\": 0.04382094705550988,\n\
\ \"acc_norm\": 0.5190839694656488,\n \"acc_norm_stderr\": 0.04382094705550988\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.026853450377009154,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.026853450377009154\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.016857391247472552,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.016857391247472552\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5578034682080925,\n \"acc_stderr\": 0.0267386036438074,\n\
\ \"acc_norm\": 0.5578034682080925,\n \"acc_norm_stderr\": 0.0267386036438074\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095277,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095277\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.545751633986928,\n \"acc_stderr\": 0.02850980780262659,\n\
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.02850980780262659\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\
\ \"acc_stderr\": 0.027917050748484617,\n \"acc_norm\": 0.5916398713826366,\n\
\ \"acc_norm_stderr\": 0.027917050748484617\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.0272725828498398,\n\
\ \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.0272725828498398\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.02909767559946393,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.02909767559946393\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\
\ \"acc_stderr\": 0.012604960816087371,\n \"acc_norm\": 0.4198174706649283,\n\
\ \"acc_norm_stderr\": 0.012604960816087371\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.030254372573976694,\n\
\ \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.030254372573976694\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5310457516339869,\n \"acc_stderr\": 0.020188804456361887,\n \
\ \"acc_norm\": 0.5310457516339869,\n \"acc_norm_stderr\": 0.020188804456361887\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6374269005847953,\n \"acc_stderr\": 0.0368713061556206,\n\
\ \"acc_norm\": 0.6374269005847953,\n \"acc_norm_stderr\": 0.0368713061556206\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.016095884155386854,\n \"mc2\": 0.47135907975593017,\n\
\ \"mc2_stderr\": 0.014951001296424498\n }\n}\n```"
repo_url: https://huggingface.co/speechlessai/speechless-codellama-34b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|arc:challenge|25_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hellaswag|10_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T19-09-51.319301.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T19-09-51.319301.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T19-09-51.319301.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T19-09-51.319301.parquet'
- config_name: results
data_files:
- split: 2023_09_13T19_09_51.319301
path:
- results_2023-09-13T19-09-51.319301.parquet
- split: latest
path:
- results_2023-09-13T19-09-51.319301.parquet
---
# Dataset Card for Evaluation run of speechlessai/speechless-codellama-34b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/speechlessai/speechless-codellama-34b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [speechlessai/speechless-codellama-34b-v1.0](https://huggingface.co/speechlessai/speechless-codellama-34b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_speechlessai__speechless-codellama-34b-v1.0",
"harness_truthfulqa_mc_0",
split="train")
```
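As a further example (a minimal sketch based on the configs listed in this card's metadata, where a `results` config with a `latest` split is defined), the aggregated metrics of the most recent run can be loaded directly:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of each run;
# the "latest" split points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_speechlessai__speechless-codellama-34b-v1.0",
    "results",
    split="latest",
)
print(results[0])  # a single row with the aggregated scores
```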
## Latest results
These are the [latest results from run 2023-09-13T19:09:51.319301](https://huggingface.co/datasets/open-llm-leaderboard/details_speechlessai__speechless-codellama-34b-v1.0/blob/main/results_2023-09-13T19-09-51.319301.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5342362830958945,
"acc_stderr": 0.034949132938444705,
"acc_norm": 0.538004053070666,
"acc_norm_stderr": 0.03493872989072948,
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386854,
"mc2": 0.47135907975593017,
"mc2_stderr": 0.014951001296424498
},
"harness|arc:challenge|25": {
"acc": 0.49573378839590443,
"acc_stderr": 0.014610858923956945,
"acc_norm": 0.5247440273037542,
"acc_norm_stderr": 0.014593487694937738
},
"harness|hellaswag|10": {
"acc": 0.547998406691894,
"acc_stderr": 0.004966736811010487,
"acc_norm": 0.7412865962955587,
"acc_norm_stderr": 0.004370328224831782
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.04060127035236395,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.04060127035236395
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4830188679245283,
"acc_stderr": 0.030755120364119905,
"acc_norm": 0.4830188679245283,
"acc_norm_stderr": 0.030755120364119905
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520203,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520203
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5806451612903226,
"acc_stderr": 0.02807158890109185,
"acc_norm": 0.5806451612903226,
"acc_norm_stderr": 0.02807158890109185
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.033456784227567746,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.033456784227567746
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7461139896373057,
"acc_stderr": 0.03141024780565317,
"acc_norm": 0.7461139896373057,
"acc_norm_stderr": 0.03141024780565317
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48717948717948717,
"acc_stderr": 0.02534267129380725,
"acc_norm": 0.48717948717948717,
"acc_norm_stderr": 0.02534267129380725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514566,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514566
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03242225027115006,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03242225027115006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7211009174311926,
"acc_stderr": 0.0192274688764635,
"acc_norm": 0.7211009174311926,
"acc_norm_stderr": 0.0192274688764635
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.033448873829978666,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.033448873829978666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036427,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036427
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5190839694656488,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.5190839694656488,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009154,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009154
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.016857391247472552,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.016857391247472552
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5578034682080925,
"acc_stderr": 0.0267386036438074,
"acc_norm": 0.5578034682080925,
"acc_norm_stderr": 0.0267386036438074
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095277,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095277
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.02850980780262659,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.02850980780262659
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.027917050748484617,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.027917050748484617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.0272725828498398,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.0272725828498398
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.012604960816087371,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.012604960816087371
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.030254372573976694,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.030254372573976694
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5310457516339869,
"acc_stderr": 0.020188804456361887,
"acc_norm": 0.5310457516339869,
"acc_norm_stderr": 0.020188804456361887
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6374269005847953,
"acc_stderr": 0.0368713061556206,
"acc_norm": 0.6374269005847953,
"acc_norm_stderr": 0.0368713061556206
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386854,
"mc2": 0.47135907975593017,
"mc2_stderr": 0.014951001296424498
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/kawasaki_saki_yahariorenoseishunlovecomewamachigatteiru | 2023-09-17T17:36:17.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kawasaki Saki
This is the dataset of Kawasaki Saki, containing 128 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 128 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 306 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 128 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 128 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 128 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 128 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 128 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 306 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 306 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 306 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
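Each package above is a plain zip archive stored in this repository, so it can be fetched with any Hub client. A minimal sketch assuming the standard `huggingface_hub` library; the `filename` is just one of the archives listed in the table and can be swapped for any other:
```python
from huggingface_hub import hf_hub_download

# Fetch one of the packaged archives from this dataset repository;
# "dataset-raw.zip" is the raw package listed in the table above.
archive_path = hf_hub_download(
    repo_id="CyberHarem/kawasaki_saki_yahariorenoseishunlovecomewamachigatteiru",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
print(archive_path)  # local cache path of the downloaded zip
```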
|
wdwfdw32/ehab | 2023-09-13T19:13:29.000Z | [
"region:us"
] | wdwfdw32 | null | null | null | 0 | 0 | Entry not found |
esantiago/processed_demo | 2023-09-13T19:14:18.000Z | [
"region:us"
] | esantiago | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: pokemon
dtype: int64
- name: type
dtype: int64
splits:
- name: train
num_bytes: 240
num_examples: 15
download_size: 1469
dataset_size: 240
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "processed_demo"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/honda_mio_idolmastercinderellagirls | 2023-09-17T17:36:20.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of honda_mio (THE iDOLM@STER: Cinderella Girls)
This is the dataset of honda_mio (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 504 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 504 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 504 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 504 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
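As with the other packages, each download above is a zip archive. A short sketch, assuming `huggingface_hub` and the standard library, that fetches one of the aligned packages from the table and unpacks it locally (the output directory name is hypothetical):
```python
import zipfile

from huggingface_hub import hf_hub_download

# Download the 640x880 aligned package (one of the archives listed above)
# and extract it into a local directory.
archive = hf_hub_download(
    repo_id="CyberHarem/honda_mio_idolmastercinderellagirls",
    filename="dataset-640x880.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(archive) as zf:
    zf.extractall("honda_mio_640x880")  # hypothetical output directory
```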
|
CyberHarem/tachibana_alice_idolmastercinderellagirls | 2023-09-17T17:36:22.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tachibana_alice (THE iDOLM@STER: Cinderella Girls)
This is the dataset of tachibana_alice (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 529 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 529 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 529 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 529 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_wenge-research__yayi-70b-llama2 | 2023-09-13T20:09:29.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of wenge-research/yayi-70b-llama2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wenge-research/yayi-70b-llama2](https://huggingface.co/wenge-research/yayi-70b-llama2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wenge-research__yayi-70b-llama2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T20:08:14.965059](https://huggingface.co/datasets/open-llm-leaderboard/details_wenge-research__yayi-70b-llama2/blob/main/results_2023-09-13T20-08-14.965059.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6427362614871128,\n\
\ \"acc_stderr\": 0.03251742836753478,\n \"acc_norm\": 0.6468766983428953,\n\
\ \"acc_norm_stderr\": 0.032494548846313066,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.4762734947955207,\n\
\ \"mc2_stderr\": 0.01439837288557781\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.014500682618212862,\n\
\ \"acc_norm\": 0.606655290102389,\n \"acc_norm_stderr\": 0.014275101465693026\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.640211113324039,\n\
\ \"acc_stderr\": 0.0047895751634186535,\n \"acc_norm\": 0.8392750448117905,\n\
\ \"acc_norm_stderr\": 0.00366526456385775\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6170212765957447,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.04644602091222318,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.04644602091222318\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469536,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469536\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289708,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289708\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.04026141497634612,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.04026141497634612\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200154,\n \"\
acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200154\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944853,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944853\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7354260089686099,\n\
\ \"acc_stderr\": 0.02960510321703832,\n \"acc_norm\": 0.7354260089686099,\n\
\ \"acc_norm_stderr\": 0.02960510321703832\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.0309227883204458,\n \"acc_norm\"\
: 0.8677685950413223,\n \"acc_norm_stderr\": 0.0309227883204458\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.038935425188248475,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.038935425188248475\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.01374079725857982,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.01374079725857982\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n\
\ \"acc_stderr\": 0.014912413096372434,\n \"acc_norm\": 0.2737430167597765,\n\
\ \"acc_norm_stderr\": 0.014912413096372434\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.0246596851859673,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.0246596851859673\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5110821382007823,\n\
\ \"acc_stderr\": 0.012767098998525826,\n \"acc_norm\": 0.5110821382007823,\n\
\ \"acc_norm_stderr\": 0.012767098998525826\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n\
\ \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083376,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083376\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.027049257915896175,\n\
\ \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.027049257915896175\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070803,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070803\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.4762734947955207,\n\
\ \"mc2_stderr\": 0.01439837288557781\n }\n}\n```"
repo_url: https://huggingface.co/wenge-research/yayi-70b-llama2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|arc:challenge|25_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hellaswag|10_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T20-08-14.965059.parquet'
- config_name: results
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- results_2023-09-13T20-08-14.965059.parquet
- split: latest
path:
- results_2023-09-13T20-08-14.965059.parquet
---
# Dataset Card for Evaluation run of wenge-research/yayi-70b-llama2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wenge-research/yayi-70b-llama2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wenge-research/yayi-70b-llama2](https://huggingface.co/wenge-research/yayi-70b-llama2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wenge-research__yayi-70b-llama2",
"harness_truthfulqa_mc_0",
split="train")
```
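Since the configs above define both a timestamped split and a `latest` alias, you can also request the most recent results of a single task explicitly (a minimal sketch; any config name from the YAML header works):
```python
from datasets import load_dataset

# "latest" always aliases the most recent timestamped run of this task.
data = load_dataset(
    "open-llm-leaderboard/details_wenge-research__yayi-70b-llama2",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
```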
## Latest results
These are the [latest results from run 2023-09-13T20:08:14.965059](https://huggingface.co/datasets/open-llm-leaderboard/details_wenge-research__yayi-70b-llama2/blob/main/results_2023-09-13T20-08-14.965059.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6427362614871128,
"acc_stderr": 0.03251742836753478,
"acc_norm": 0.6468766983428953,
"acc_norm_stderr": 0.032494548846313066,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155045,
"mc2": 0.4762734947955207,
"mc2_stderr": 0.01439837288557781
},
"harness|arc:challenge|25": {
"acc": 0.5614334470989761,
"acc_stderr": 0.014500682618212862,
"acc_norm": 0.606655290102389,
"acc_norm_stderr": 0.014275101465693026
},
"harness|hellaswag|10": {
"acc": 0.640211113324039,
"acc_stderr": 0.0047895751634186535,
"acc_norm": 0.8392750448117905,
"acc_norm_stderr": 0.00366526456385775
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04644602091222318,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04644602091222318
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469536,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469536
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289708,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289708
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.04026141497634612,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.04026141497634612
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.016722684526200154,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.016722684526200154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944853,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944853
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7354260089686099,
"acc_stderr": 0.02960510321703832,
"acc_norm": 0.7354260089686099,
"acc_norm_stderr": 0.02960510321703832
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.0309227883204458,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.0309227883204458
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.038935425188248475,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.038935425188248475
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.01374079725857982,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.01374079725857982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372434,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372434
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7363344051446945,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.7363344051446945,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.0246596851859673,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.0246596851859673
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5110821382007823,
"acc_stderr": 0.012767098998525826,
"acc_norm": 0.5110821382007823,
"acc_norm_stderr": 0.012767098998525826
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003476,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003476
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083376,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083376
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.027049257915896175,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.027049257915896175
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070803,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070803
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155045,
"mc2": 0.4762734947955207,
"mc2_stderr": 0.01439837288557781
}
}
```
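The macro-averaged MMLU accuracy can be recomputed from the per-task entries above; here is a small sketch assuming `results` holds the dict exactly as printed (the two sample entries are copied from it):
```python
def mmlu_macro_avg(results: dict) -> float:
    """Macro-average accuracy over the hendrycksTest (MMLU) subtasks."""
    accs = [
        scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# Example with two entries copied from the results above:
sample = {
    "harness|hendrycksTest-virology|5": {"acc": 0.536144578313253},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8538011695906432},
}
print(round(mmlu_macro_avg(sample), 4))  # 0.695, the mean of the two accuracies
```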
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Nicolas-BZRD/French_Transcribed_Podcast | 2023-09-22T10:03:18.000Z | [
"task_categories:automatic-speech-recognition",
"size_categories:100K<n<1M",
"language:fr",
"license:unknown",
"Podcast",
"Audio",
"Transcribed",
"region:us"
] | Nicolas-BZRD | null | null | null | 0 | 0 | ---
language:
- fr
license: unknown
size_categories:
- 100K<n<1M
task_categories:
- automatic-speech-recognition
pretty_name: Transcribed French Podcast
tags:
- Podcast
- Audio
- Transcribed
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: programme_id
dtype: string
- name: programme_entry_date
dtype: string
- name: programme_rss_link
dtype: string
- name: podcast_title
dtype: string
- name: podcast_date
dtype: string
- name: podcast_duration
dtype: string
- name: audio_podcast_link
dtype: string
splits:
- name: train
num_bytes: 96627005
num_examples: 281759
download_size: 28777088
dataset_size: 96627005
---
# French Transcribed Podcast
### Dataset Summary
Dataset of 280,000 mp3 links to French podcasts. Transcription using [whisper](https://github.com/openai/whisper) is underway. However, due to the large number of podcasts, it will not be possible to transcribe all of them. We are therefore counting on the community to help us finish this colossal task.
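A minimal sketch of how a contributor could transcribe one episode, assuming the `audio_podcast_link` field points to a downloadable mp3 (the local file name and whisper model size are illustrative choices):
```python
import urllib.request

import whisper
from datasets import load_dataset

ds = load_dataset("Nicolas-BZRD/French_Transcribed_Podcast", split="train")
model = whisper.load_model("small")  # illustrative model size

row = ds[0]
# Download one episode locally; some links may no longer work (see below).
urllib.request.urlretrieve(row["audio_podcast_link"], "episode.mp3")

result = model.transcribe("episode.mp3", language="fr")
print(result["text"])
```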
The total duration of the podcasts is estimated at approximately 2,958 days (4,259,523 minutes). However, this value is only an indication, as some links no longer seem to work and not all podcasts have the indicated duration.
N.B. The podcast links are available on the French government's data gouv [website](https://www.data.gouv.fr/fr/datasets/podcasts-francais-archives-a-lina/). |
TamerlanW/Stadi | 2023-09-26T19:15:10.000Z | [
"region:us"
] | TamerlanW | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B-v2 | 2023-09-13T20:56:04.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Weyaxi/Luban-Marcoroni-13B-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/Luban-Marcoroni-13B-v2](https://huggingface.co/Weyaxi/Luban-Marcoroni-13B-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B-v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T20:54:44.969205](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B-v2/blob/main/results_2023-09-13T20-54-44.969205.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5883403273122713,\n\
\ \"acc_stderr\": 0.0340528210168368,\n \"acc_norm\": 0.5921503303898759,\n\
\ \"acc_norm_stderr\": 0.03403182192905857,\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.5555918994652874,\n\
\ \"mc2_stderr\": 0.015731778754042403\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6117747440273038,\n \"acc_stderr\": 0.01424161420741404,\n\
\ \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.014070265519268802\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6271659032065325,\n\
\ \"acc_stderr\": 0.004825702533920413,\n \"acc_norm\": 0.828918542123083,\n\
\ \"acc_norm_stderr\": 0.0037581050431501244\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.04309732901036356,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.04309732901036356\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36507936507936506,\n \"acc_stderr\": 0.02479606060269995,\n \"\
acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.02479606060269995\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.026069362295335137,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.026069362295335137\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164552,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164552\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5666666666666667,\n \"acc_stderr\": 0.025124653525885117,\n\
\ \"acc_norm\": 0.5666666666666667,\n \"acc_norm_stderr\": 0.025124653525885117\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.03201650100739611,\n\
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.03201650100739611\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7596330275229358,\n \"acc_stderr\": 0.01832060732096407,\n \"\
acc_norm\": 0.7596330275229358,\n \"acc_norm_stderr\": 0.01832060732096407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.02786594228663933,\n \"acc_norm\"\
: 0.803921568627451,\n \"acc_norm_stderr\": 0.02786594228663933\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159253,\n \"\
acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159253\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489277,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489277\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.014836205167333562,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.014836205167333562\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.0165136760311796,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.0165136760311796\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.027245613047215365,\n\
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.027245613047215365\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.02698147804364804,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.02698147804364804\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882116,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882116\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236844,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42503259452411996,\n\
\ \"acc_stderr\": 0.012625879884891996,\n \"acc_norm\": 0.42503259452411996,\n\
\ \"acc_norm_stderr\": 0.012625879884891996\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324224,\n \
\ \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324224\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.5555918994652874,\n\
\ \"mc2_stderr\": 0.015731778754042403\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/Luban-Marcoroni-13B-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|arc:challenge|25_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hellaswag|10_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T20-54-44.969205.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T20-54-44.969205.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T20-54-44.969205.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T20-54-44.969205.parquet'
- config_name: results
data_files:
- split: 2023_09_13T20_54_44.969205
path:
- results_2023-09-13T20-54-44.969205.parquet
- split: latest
path:
- results_2023-09-13T20-54-44.969205.parquet
---
# Dataset Card for Evaluation run of Weyaxi/Luban-Marcoroni-13B-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/Luban-Marcoroni-13B-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/Luban-Marcoroni-13B-v2](https://huggingface.co/Weyaxi/Luban-Marcoroni-13B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B-v2",
"harness_truthfulqa_mc_0",
split="train")
```
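For example, a minimal sketch for loading the aggregated results of this run (the `"results"` config and `"latest"` split are listed in this card's configuration; everything else here is illustrative):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics for the run; the
# "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B-v2",
    "results",
    split="latest",
)
```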
## Latest results
These are the [latest results from run 2023-09-13T20:54:44.969205](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B-v2/blob/main/results_2023-09-13T20-54-44.969205.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5883403273122713,
"acc_stderr": 0.0340528210168368,
"acc_norm": 0.5921503303898759,
"acc_norm_stderr": 0.03403182192905857,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5555918994652874,
"mc2_stderr": 0.015731778754042403
},
"harness|arc:challenge|25": {
"acc": 0.6117747440273038,
"acc_stderr": 0.01424161420741404,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.014070265519268802
},
"harness|hellaswag|10": {
"acc": 0.6271659032065325,
"acc_stderr": 0.004825702533920413,
"acc_norm": 0.828918542123083,
"acc_norm_stderr": 0.0037581050431501244
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.04309732901036356,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.04309732901036356
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.02479606060269995,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.02479606060269995
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7,
"acc_stderr": 0.026069362295335137,
"acc_norm": 0.7,
"acc_norm_stderr": 0.026069362295335137
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164552,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5666666666666667,
"acc_stderr": 0.025124653525885117,
"acc_norm": 0.5666666666666667,
"acc_norm_stderr": 0.025124653525885117
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.03201650100739611,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.03201650100739611
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7596330275229358,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.7596330275229358,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.02786594228663933,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.02786594228663933
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159253,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159253
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489277,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489277
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333562,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333562
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.0165136760311796,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.0165136760311796
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.027245613047215365,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.027245613047215365
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364804,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364804
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02622964917882116,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02622964917882116
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236844,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42503259452411996,
"acc_stderr": 0.012625879884891996,
"acc_norm": 0.42503259452411996,
"acc_norm_stderr": 0.012625879884891996
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324224,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324224
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5555918994652874,
"mc2_stderr": 0.015731778754042403
}
}
```
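To inspect these numbers programmatically, one option is to fetch the raw results file linked above — a minimal sketch, assuming the JSON mirrors the dict shown here:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON linked above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B-v2",
    filename="results_2023-09-13T20-54-44.969205.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Inspect the top-level structure before relying on any particular key.
print(sorted(results.keys()))
```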
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/igarashi_kyouko_idolmastercinderellagirls | 2023-09-17T17:36:24.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of igarashi_kyouko (THE iDOLM@STER: Cinderella Girls)
This is the dataset of igarashi_kyouko (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 524 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 524 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 524 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 524 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
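A minimal download sketch for one of the variants above (assuming the zip files sit at the repository root, as the relative links in the table suggest):
```python
from huggingface_hub import hf_hub_download

# Download the 384x512 aligned variant (an illustrative choice) from the
# dataset repository; the filename comes from the table above.
path = hf_hub_download(
    repo_id="CyberHarem/igarashi_kyouko_idolmastercinderellagirls",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
print(path)
```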
|
CyberHarem/akagi_miria_idolmastercinderellagirls | 2023-09-17T17:36:26.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of akagi_miria (THE iDOLM@STER: Cinderella Girls)
This is the dataset of akagi_miria (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 531 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 531 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 531 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 531 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
DmatryMakeev/SadTalker | 2023-09-14T15:37:26.000Z | [
"region:us"
] | DmatryMakeev | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B-v3 | 2023-09-13T22:13:42.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Weyaxi/Luban-Marcoroni-13B-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/Luban-Marcoroni-13B-v3](https://huggingface.co/Weyaxi/Luban-Marcoroni-13B-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B-v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T22:12:25.570871](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B-v3/blob/main/results_2023-09-13T22-12-25.570871.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.587560950339616,\n\
\ \"acc_stderr\": 0.03404542765781976,\n \"acc_norm\": 0.5913981890593393,\n\
\ \"acc_norm_stderr\": 0.034024181969002004,\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.5556312122754024,\n\
\ \"mc2_stderr\": 0.01573330367079552\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.01423587248790987,\n\
\ \"acc_norm\": 0.6373720136518771,\n \"acc_norm_stderr\": 0.014049106564955007\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6271659032065325,\n\
\ \"acc_stderr\": 0.004825702533920413,\n \"acc_norm\": 0.8288189603664609,\n\
\ \"acc_norm_stderr\": 0.003758972816627593\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.02475747390275206,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.02475747390275206\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.026069362295335137,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.026069362295335137\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164552,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164552\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5666666666666667,\n \"acc_stderr\": 0.025124653525885117,\n\
\ \"acc_norm\": 0.5666666666666667,\n \"acc_norm_stderr\": 0.025124653525885117\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n\
\ \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7577981651376147,\n \"acc_stderr\": 0.01836817630659862,\n \"\
acc_norm\": 0.7577981651376147,\n \"acc_norm_stderr\": 0.01836817630659862\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.02786594228663933,\n \"acc_norm\"\
: 0.803921568627451,\n \"acc_norm_stderr\": 0.02786594228663933\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159253,\n \"\
acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159253\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489277,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489277\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.014836205167333562,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.014836205167333562\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.026033890613576284,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.026033890613576284\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.0165136760311796,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.0165136760311796\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.027245613047215365,\n\
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.027245613047215365\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.02698147804364804,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.02698147804364804\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882116,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882116\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.424380704041721,\n\
\ \"acc_stderr\": 0.012623343757430018,\n \"acc_norm\": 0.424380704041721,\n\
\ \"acc_norm_stderr\": 0.012623343757430018\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.03010563657001663,\n\
\ \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.03010563657001663\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.565359477124183,\n \"acc_stderr\": 0.02005426920072646,\n \
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.02005426920072646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.0389136449583582,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.0389136449583582\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.5556312122754024,\n\
\ \"mc2_stderr\": 0.01573330367079552\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/Luban-Marcoroni-13B-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|arc:challenge|25_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hellaswag|10_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T22-12-25.570871.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T22-12-25.570871.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T22-12-25.570871.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T22-12-25.570871.parquet'
- config_name: results
data_files:
- split: 2023_09_13T22_12_25.570871
path:
- results_2023-09-13T22-12-25.570871.parquet
- split: latest
path:
- results_2023-09-13T22-12-25.570871.parquet
---
# Dataset Card for Evaluation run of Weyaxi/Luban-Marcoroni-13B-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/Luban-Marcoroni-13B-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/Luban-Marcoroni-13B-v3](https://huggingface.co/Weyaxi/Luban-Marcoroni-13B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B-v3",
"harness_truthfulqa_mc_0",
split="train")
```
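The same pattern works for any of the configurations listed in the YAML above. As a minimal sketch, the aggregated "results" configuration can be loaded at its "latest" split:
```python
from datasets import load_dataset

# "results" is the aggregated configuration; the "latest" split always
# points to the most recent run (here 2023-09-13T22:12:25.570871).
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B-v3",
    "results",
    split="latest",
)
```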
## Latest results
These are the [latest results from run 2023-09-13T22:12:25.570871](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B-v3/blob/main/results_2023-09-13T22-12-25.570871.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.587560950339616,
"acc_stderr": 0.03404542765781976,
"acc_norm": 0.5913981890593393,
"acc_norm_stderr": 0.034024181969002004,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5556312122754024,
"mc2_stderr": 0.01573330367079552
},
"harness|arc:challenge|25": {
"acc": 0.6126279863481229,
"acc_stderr": 0.01423587248790987,
"acc_norm": 0.6373720136518771,
"acc_norm_stderr": 0.014049106564955007
},
"harness|hellaswag|10": {
"acc": 0.6271659032065325,
"acc_stderr": 0.004825702533920413,
"acc_norm": 0.8288189603664609,
"acc_norm_stderr": 0.003758972816627593
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.02475747390275206,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.02475747390275206
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7,
"acc_stderr": 0.026069362295335137,
"acc_norm": 0.7,
"acc_norm_stderr": 0.026069362295335137
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164552,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5666666666666667,
"acc_stderr": 0.025124653525885117,
"acc_norm": 0.5666666666666667,
"acc_norm_stderr": 0.025124653525885117
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7577981651376147,
"acc_stderr": 0.01836817630659862,
"acc_norm": 0.7577981651376147,
"acc_norm_stderr": 0.01836817630659862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.02786594228663933,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.02786594228663933
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159253,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159253
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489277,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489277
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333562,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333562
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.026033890613576284,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.026033890613576284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.0165136760311796,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.0165136760311796
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.027245613047215365,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.027245613047215365
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364804,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364804
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02622964917882116,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02622964917882116
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.424380704041721,
"acc_stderr": 0.012623343757430018,
"acc_norm": 0.424380704041721,
"acc_norm_stderr": 0.012623343757430018
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5661764705882353,
"acc_stderr": 0.03010563657001663,
"acc_norm": 0.5661764705882353,
"acc_norm_stderr": 0.03010563657001663
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.02005426920072646,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.02005426920072646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.0389136449583582,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.0389136449583582
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5556312122754024,
"mc2_stderr": 0.01573330367079552
}
}
```
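If you prefer working with the raw results file linked above rather than the `datasets` splits, a minimal sketch using `huggingface_hub` (assuming the filename shown in the link) is:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw per-run results file from this dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Weyaxi__Luban-Marcoroni-13B-v3",
    filename="results_2023-09-13T22-12-25.570871.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

# Inspect the top-level keys; the metrics shown above live in this file.
print(list(raw.keys()))
```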
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/otokura_yuuki_idolmastercinderellagirls | 2023-09-17T17:36:28.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of otokura_yuuki (THE iDOLM@STER: Cinderella Girls)
This is the dataset of otokura_yuuki (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 528 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 528 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 528 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 528 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
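The download links in the table point to zip archives stored in this dataset repository. As a minimal sketch (assuming `huggingface_hub` is installed), a package can also be fetched programmatically:
```python
from huggingface_hub import hf_hub_download

# Fetch the raw package (images plus meta information) from this repo.
zip_path = hf_hub_download(
    repo_id="CyberHarem/otokura_yuuki_idolmastercinderellagirls",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
print(zip_path)
```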
|
CyberHarem/oikawa_shizuku_idolmastercinderellagirls | 2023-09-17T17:36:30.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of oikawa_shizuku (THE iDOLM@STER: Cinderella Girls)
This is the dataset of oikawa_shizuku (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 514 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 514 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 514 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 514 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
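The archives in the table can likewise be fetched programmatically; a minimal sketch for the 512x512 aligned package (assuming `huggingface_hub` is installed):
```python
from huggingface_hub import hf_hub_download

# Fetch the 512x512 aligned package from this dataset repository.
zip_path = hf_hub_download(
    repo_id="CyberHarem/oikawa_shizuku_idolmastercinderellagirls",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
print(zip_path)
```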
|
open-llm-leaderboard/details_oh-yeontaek__llama-2-13B-LoRA-assemble | 2023-09-13T23:31:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of oh-yeontaek/llama-2-13B-LoRA-assemble
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [oh-yeontaek/llama-2-13B-LoRA-assemble](https://huggingface.co/oh-yeontaek/llama-2-13B-LoRA-assemble)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_oh-yeontaek__llama-2-13B-LoRA-assemble\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T23:30:08.066135](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-13B-LoRA-assemble/blob/main/results_2023-09-13T23-30-08.066135.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5989105794324632,\n\
\ \"acc_stderr\": 0.03386180683240171,\n \"acc_norm\": 0.6028757876508483,\n\
\ \"acc_norm_stderr\": 0.033838980731318,\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.5595996097364623,\n\
\ \"mc2_stderr\": 0.015690304235652236\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946702,\n\
\ \"acc_norm\": 0.6356655290102389,\n \"acc_norm_stderr\": 0.014063260279882417\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6361282613025294,\n\
\ \"acc_stderr\": 0.004801290954387088,\n \"acc_norm\": 0.8350926110336586,\n\
\ \"acc_norm_stderr\": 0.0037033852685121747\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873634,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873634\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028424,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028424\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.026729499068349958,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.026729499068349958\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.02460362692409742,\n \
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.02460362692409742\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547307,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547307\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7926605504587156,\n \"acc_stderr\": 0.01738141556360868,\n \"\
acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.01738141556360868\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.033723432716530645,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.033723432716530645\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489294,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489294\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7956577266922095,\n\
\ \"acc_stderr\": 0.0144191239809319,\n \"acc_norm\": 0.7956577266922095,\n\
\ \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.025469770149400172,\n\
\ \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.025469770149400172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4849162011173184,\n\
\ \"acc_stderr\": 0.016714890379996062,\n \"acc_norm\": 0.4849162011173184,\n\
\ \"acc_norm_stderr\": 0.016714890379996062\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281416,\n\
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281416\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n\
\ \"acc_stderr\": 0.012728446067669956,\n \"acc_norm\": 0.4595827900912647,\n\
\ \"acc_norm_stderr\": 0.012728446067669956\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5882352941176471,\n \"acc_stderr\": 0.019910377463105932,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.019910377463105932\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.03002105623844031,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.03002105623844031\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.5595996097364623,\n\
\ \"mc2_stderr\": 0.015690304235652236\n }\n}\n```"
repo_url: https://huggingface.co/oh-yeontaek/llama-2-13B-LoRA-assemble
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|arc:challenge|25_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hellaswag|10_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T23-30-08.066135.parquet'
- config_name: results
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- results_2023-09-13T23-30-08.066135.parquet
- split: latest
path:
- results_2023-09-13T23-30-08.066135.parquet
---
# Dataset Card for Evaluation run of oh-yeontaek/llama-2-13B-LoRA-assemble
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/oh-yeontaek/llama-2-13B-LoRA-assemble
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [oh-yeontaek/llama-2-13B-LoRA-assemble](https://huggingface.co/oh-yeontaek/llama-2-13B-LoRA-assemble) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_oh-yeontaek__llama-2-13B-LoRA-assemble",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-13T23:30:08.066135](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-13B-LoRA-assemble/blob/main/results_2023-09-13T23-30-08.066135.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5989105794324632,
"acc_stderr": 0.03386180683240171,
"acc_norm": 0.6028757876508483,
"acc_norm_stderr": 0.033838980731318,
"mc1": 0.4039167686658507,
"mc1_stderr": 0.017177276822584284,
"mc2": 0.5595996097364623,
"mc2_stderr": 0.015690304235652236
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946702,
"acc_norm": 0.6356655290102389,
"acc_norm_stderr": 0.014063260279882417
},
"harness|hellaswag|10": {
"acc": 0.6361282613025294,
"acc_stderr": 0.004801290954387088,
"acc_norm": 0.8350926110336586,
"acc_norm_stderr": 0.0037033852685121747
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873634,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873634
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.024508777521028424,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.024508777521028424
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.026729499068349958,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.026729499068349958
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.02460362692409742,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.02460362692409742
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547307,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547307
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413926,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.01738141556360868,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.01738141556360868
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.033723432716530645,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.033723432716530645
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489294,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489294
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7956577266922095,
"acc_stderr": 0.0144191239809319,
"acc_norm": 0.7956577266922095,
"acc_norm_stderr": 0.0144191239809319
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.025469770149400172,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.025469770149400172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4849162011173184,
"acc_stderr": 0.016714890379996062,
"acc_norm": 0.4849162011173184,
"acc_norm_stderr": 0.016714890379996062
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281416,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281416
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.012728446067669956,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.012728446067669956
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.019910377463105932,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.019910377463105932
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.03002105623844031,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.03002105623844031
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4039167686658507,
"mc1_stderr": 0.017177276822584284,
"mc2": 0.5595996097364623,
"mc2_stderr": 0.015690304235652236
}
}
```
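The aggregated numbers above are also exposed through the `results` configuration declared in the YAML header; a minimal loading sketch, using the `latest` split that the header defines:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split always
# points at the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_oh-yeontaek__llama-2-13B-LoRA-assemble",
    "results",
    split="latest",
)
```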
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mattrichmo/brand-logos | 2023-09-14T20:21:42.000Z | [
"size_categories:1K<n<10K",
"region:us"
] | mattrichmo | null | null | null | 0 | 0 | ---
size_categories:
- 1K<n<10K
---
An Ongoing SVG Collection of Many Brand Logos
Each object has the following shape (a hypothetical example instance follows the tree):
```
├── brandName ""
├── brandWebsite ""
├── brandPresence[{
│ └── platform
│ └── url
│ └── username}]
├── brandLogo[{
│ └── fileName
│ └── svgPath
│ └── svgData
│ ├── meta
│ ├── width
│ ├── height
│ ├── viewbox
│ └── fill
│ └── svgRaw}]
└── brandColors[{
└── meta
├── primary
├── secondary
├── tertiary
├── quaternary
├── priority
└── setting
├── colorName
├── colorHex
├── colorRGB
├── colorCMYK
└── colorPantone}]
``` |
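For illustration, a single record matching this shape might look like the sketch below. All values are invented, and the exact nesting of the `meta` fields is one plausible reading of the tree above:
```python
# Hypothetical record matching the shape above; every value is invented.
record = {
    "brandName": "ExampleCo",
    "brandWebsite": "https://example.com",
    "brandPresence": [
        {
            "platform": "twitter",
            "url": "https://twitter.com/exampleco",
            "username": "exampleco",
        }
    ],
    "brandLogo": [
        {
            "fileName": "exampleco-primary.svg",
            "svgPath": "logos/exampleco-primary.svg",
            "svgData": {
                "meta": {},
                "width": 512,
                "height": 512,
                "viewbox": "0 0 512 512",
                "fill": "#0055AA",
            },
            "svgRaw": "<svg xmlns='http://www.w3.org/2000/svg'></svg>",
        }
    ],
    "brandColors": [
        {
            "meta": {
                "primary": True,
                "secondary": False,
                "tertiary": False,
                "quaternary": False,
                "priority": 1,
                "setting": "light",
            },
            "colorName": "Example Blue",
            "colorHex": "#0055AA",
            "colorRGB": "rgb(0, 85, 170)",
            "colorCMYK": "cmyk(100, 50, 0, 33)",
            "colorPantone": "PMS 2935 C",
        }
    ],
}
```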
jsonfin17/financial_conversation_summary | 2023-09-14T00:07:35.000Z | [
"region:us"
] | jsonfin17 | null | null | null | 0 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/mifune_miyu_idolmastercinderellagirls | 2023-09-17T17:36:32.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mifune_miyu (THE iDOLM@STER: Cinderella Girls)
This is the dataset of mifune_miyu (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 508 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 508 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 508 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 508 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
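Since each variant is a plain zip archive inside the repo, one way to fetch an archive programmatically is via `huggingface_hub` (a sketch; the `filename` must match one of the archives listed above):
```python
from huggingface_hub import hf_hub_download

# Download one packaged archive from this dataset repository.
path = hf_hub_download(
    repo_id="CyberHarem/mifune_miyu_idolmastercinderellagirls",
    filename="dataset-512x704.zip",
    repo_type="dataset",
)
print(path)  # local cache path of the downloaded zip
```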
|
huangyt/FINETUNE3_TEST3 | 2023-09-14T00:48:06.000Z | [
"license:openrail",
"region:us"
] | huangyt | null | null | null | 0 | 0 | ---
license: openrail
---
|
CyberHarem/hino_akane_idolmastercinderellagirls | 2023-09-17T17:36:34.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hino_akane (THE iDOLM@STER: Cinderella Girls)
This is the dataset of hino_akane (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 551 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 551 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 551 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 551 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
goodfellowliu/DIV2K | 2023-09-14T02:00:54.000Z | [
"license:apache-2.0",
"region:us"
] | goodfellowliu | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Namitoo/demo | 2023-09-14T10:54:56.000Z | [
"task_categories:summarization",
"size_categories:10M<n<100M",
"language:chi",
"not-for-all-audiences",
"art",
"audio",
"region:us"
] | Namitoo | null | null | null | 0 | 0 | ---
language: chi
size_categories:
- 10M<n<100M
task_categories:
- summarization
tags:
- not-for-all-audiences
- art
- audio
pretty_name: didi-rito
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/ichihara_nina_idolmastercinderellagirls | 2023-09-17T17:36:36.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ichihara_nina (THE iDOLM@STER: Cinderella Girls)
This is the dataset of ichihara_nina (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 546 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 546 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 546 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 546 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/takamori_aiko_idolmastercinderellagirls | 2023-09-17T17:36:38.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of takamori_aiko (THE iDOLM@STER: Cinderella Girls)
This is the dataset of takamori_aiko (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 533 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 533 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 533 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 533 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
ganeshjcs/hindi-headline-article-generation | 2023-10-10T03:05:44.000Z | [
"license:cc-by-sa-4.0",
"region:us"
] | ganeshjcs | null | null | null | 0 | 0 | ---
license: cc-by-sa-4.0
---
|
CyberHarem/satou_shin_idolmastercinderellagirls | 2023-09-17T17:36:40.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of satou_shin (THE iDOLM@STER: Cinderella Girls)
This is the dataset of satou_shin (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 536 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 536 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 536 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 536 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
zxx-silence/shiba-inu-fenda-dreambooth | 2023-09-14T03:12:55.000Z | [
"region:us"
] | zxx-silence | null | null | null | 0 | 0 | Entry not found |
jsonfin17/testing-financialGoals | 2023-09-14T03:20:50.000Z | [
"region:us"
] | jsonfin17 | null | null | null | 0 | 0 | Entry not found |
kuronomiki/gblk99 | 2023-09-14T04:02:29.000Z | [
"license:other",
"region:us"
] | kuronomiki | null | null | null | 0 | 0 | ---
license: other
---
|
CyberHarem/tada_riina_idolmastercinderellagirls | 2023-09-17T17:36:42.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tada_riina (THE iDOLM@STER: Cinderella Girls)
This is the dataset of tada_riina (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 528 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 528 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 528 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 528 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
KYKNZ/SDXL | 2023-09-14T04:17:40.000Z | [
"license:cc0-1.0",
"region:us"
] | KYKNZ | null | null | null | 0 | 0 | ---
license: cc0-1.0
---
|
hanho/test1 | 2023-09-14T04:34:58.000Z | [
"license:openrail",
"region:us"
] | hanho | null | null | null | 0 | 0 | ---
license: openrail
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: package_name
dtype: string
- name: review
dtype: string
- name: date
dtype: string
- name: star
dtype: int64
- name: version_id
dtype: int64
splits:
- name: train
num_bytes: 1508
num_examples: 5
- name: test
num_bytes: 956
num_examples: 5
download_size: 9453
dataset_size: 2464
---
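Given the configuration above (train and test splits of review-style records), a minimal loading sketch:
```python
from datasets import load_dataset

# Load both splits declared in the dataset_info block.
ds = load_dataset("hanho/test1")
print(ds["train"][0])  # a record with id, package_name, review, date, star, version_id
```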
|
CyberHarem/hisakawa_hayate_idolmastercinderellagirls | 2023-09-17T17:36:44.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hisakawa_hayate (THE iDOLM@STER: Cinderella Girls)
This is the dataset of hisakawa_hayate (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 522 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 522 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 522 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 522 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_NoIdeaLand__test-2048-1500ck | 2023-09-14T04:40:54.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NoIdeaLand/test-2048-1500ck
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NoIdeaLand/test-2048-1500ck](https://huggingface.co/NoIdeaLand/test-2048-1500ck)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NoIdeaLand__test-2048-1500ck\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T04:39:40.489809](https://huggingface.co/datasets/open-llm-leaderboard/details_NoIdeaLand__test-2048-1500ck/blob/main/results_2023-09-14T04-39-40.489809.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26196111221791213,\n\
\ \"acc_stderr\": 0.03173586961427775,\n \"acc_norm\": 0.2653334325357461,\n\
\ \"acc_norm_stderr\": 0.03173833592722594,\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062137,\n \"mc2\": 0.4095943166947606,\n\
\ \"mc2_stderr\": 0.014642509125225842\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.33532423208191126,\n \"acc_stderr\": 0.013796182947785564,\n\
\ \"acc_norm\": 0.36689419795221845,\n \"acc_norm_stderr\": 0.014084133118104294\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45817566221868156,\n\
\ \"acc_stderr\": 0.004972293764978723,\n \"acc_norm\": 0.6255725951005776,\n\
\ \"acc_norm_stderr\": 0.004829856058603573\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.037125378336148665,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.037125378336148665\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.026341480371118366,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.026341480371118366\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2986111111111111,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.2986111111111111,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.030472973363380042,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.030472973363380042\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438015,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438015\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2328042328042328,\n \"acc_stderr\": 0.021765961672154523,\n \"\
acc_norm\": 0.2328042328042328,\n \"acc_norm_stderr\": 0.021765961672154523\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790606,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22258064516129034,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.22258064516129034,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3151515151515151,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.3151515151515151,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.19696969696969696,\n \"acc_stderr\": 0.02833560973246335,\n \"\
acc_norm\": 0.19696969696969696,\n \"acc_norm_stderr\": 0.02833560973246335\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.30569948186528495,\n \"acc_stderr\": 0.03324837939758159,\n\
\ \"acc_norm\": 0.30569948186528495,\n \"acc_norm_stderr\": 0.03324837939758159\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.022489389793654824,\n\
\ \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.022489389793654824\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844086,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844086\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.02702543349888239,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.02702543349888239\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2119205298013245,\n \"acc_stderr\": 0.03336767086567977,\n \"\
acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.03336767086567977\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22018348623853212,\n \"acc_stderr\": 0.01776597865232757,\n \"\
acc_norm\": 0.22018348623853212,\n \"acc_norm_stderr\": 0.01776597865232757\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2361111111111111,\n \"acc_stderr\": 0.028963702570791033,\n \"\
acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.028963702570791033\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28431372549019607,\n \"acc_stderr\": 0.03166009679399812,\n \"\
acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.03166009679399812\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2869198312236287,\n \"acc_stderr\": 0.02944377302259469,\n \
\ \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.39461883408071746,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.39461883408071746,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3076923076923077,\n\
\ \"acc_stderr\": 0.030236389942173095,\n \"acc_norm\": 0.3076923076923077,\n\
\ \"acc_norm_stderr\": 0.030236389942173095\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2835249042145594,\n\
\ \"acc_stderr\": 0.016117318166832265,\n \"acc_norm\": 0.2835249042145594,\n\
\ \"acc_norm_stderr\": 0.016117318166832265\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2745664739884393,\n \"acc_stderr\": 0.02402774515526501,\n\
\ \"acc_norm\": 0.2745664739884393,\n \"acc_norm_stderr\": 0.02402774515526501\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.025553169991826507,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.025553169991826507\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2347266881028939,\n\
\ \"acc_stderr\": 0.024071805887677045,\n \"acc_norm\": 0.2347266881028939,\n\
\ \"acc_norm_stderr\": 0.024071805887677045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090202,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090202\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2627118644067797,\n\
\ \"acc_stderr\": 0.01124054551499566,\n \"acc_norm\": 0.2627118644067797,\n\
\ \"acc_norm_stderr\": 0.01124054551499566\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21691176470588236,\n \"acc_stderr\": 0.025035845227711254,\n\
\ \"acc_norm\": 0.21691176470588236,\n \"acc_norm_stderr\": 0.025035845227711254\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27450980392156865,\n \"acc_stderr\": 0.0180540274588152,\n \
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.0180540274588152\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.2818181818181818,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.1836734693877551,\n \"acc_stderr\": 0.024789071332007653,\n\
\ \"acc_norm\": 0.1836734693877551,\n \"acc_norm_stderr\": 0.024789071332007653\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\
\ \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n\
\ \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062137,\n \"mc2\": 0.4095943166947606,\n\
\ \"mc2_stderr\": 0.014642509125225842\n }\n}\n```"
repo_url: https://huggingface.co/NoIdeaLand/test-2048-1500ck
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|arc:challenge|25_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hellaswag|10_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T04-39-40.489809.parquet'
- config_name: results
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- results_2023-09-14T04-39-40.489809.parquet
- split: latest
path:
- results_2023-09-14T04-39-40.489809.parquet
---
# Dataset Card for Evaluation run of NoIdeaLand/test-2048-1500ck
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NoIdeaLand/test-2048-1500ck
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NoIdeaLand/test-2048-1500ck](https://huggingface.co/NoIdeaLand/test-2048-1500ck) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
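To see which runs (splits) are available for a given configuration before loading anything, you can list them with the standard `datasets` helper (a minimal sketch; the exact split names depend on the timestamps of the runs stored in this repo):
```python
from datasets import get_dataset_split_names

# Each evaluation run appears as a timestamped split, plus a "latest" alias.
splits = get_dataset_split_names(
    "open-llm-leaderboard/details_NoIdeaLand__test-2048-1500ck",
    "harness_truthfulqa_mc_0",
)
print(splits)  # e.g. ['2023_09_14T04_39_40.489809', 'latest']
```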
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NoIdeaLand__test-2048-1500ck",
"harness_truthfulqa_mc_0",
split="train")
```
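Similarly, to work with the aggregated scores rather than the per-sample details, you can load the "results" configuration described above at its "latest" split (a minimal sketch; the column layout of the results parquet may vary between harness versions):
```python
from datasets import load_dataset

# "results" is the extra configuration holding the aggregated run results;
# "latest" always points to the newest timestamped split.
results = load_dataset(
    "open-llm-leaderboard/details_NoIdeaLand__test-2048-1500ck",
    "results",
    split="latest",
)

# Inspect as a pandas DataFrame.
df = results.to_pandas()
print(df.columns)
```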
## Latest results
These are the [latest results from run 2023-09-14T04:39:40.489809](https://huggingface.co/datasets/open-llm-leaderboard/details_NoIdeaLand__test-2048-1500ck/blob/main/results_2023-09-14T04-39-40.489809.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26196111221791213,
"acc_stderr": 0.03173586961427775,
"acc_norm": 0.2653334325357461,
"acc_norm_stderr": 0.03173833592722594,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062137,
"mc2": 0.4095943166947606,
"mc2_stderr": 0.014642509125225842
},
"harness|arc:challenge|25": {
"acc": 0.33532423208191126,
"acc_stderr": 0.013796182947785564,
"acc_norm": 0.36689419795221845,
"acc_norm_stderr": 0.014084133118104294
},
"harness|hellaswag|10": {
"acc": 0.45817566221868156,
"acc_stderr": 0.004972293764978723,
"acc_norm": 0.6255725951005776,
"acc_norm_stderr": 0.004829856058603573
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.037125378336148665,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.037125378336148665
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.026341480371118366,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.026341480371118366
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2986111111111111,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.2986111111111111,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.03063114553919882,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.03063114553919882
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.030472973363380042,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.030472973363380042
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438015,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438015
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2328042328042328,
"acc_stderr": 0.021765961672154523,
"acc_norm": 0.2328042328042328,
"acc_norm_stderr": 0.021765961672154523
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790606,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22258064516129034,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.22258064516129034,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3151515151515151,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.3151515151515151,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.19696969696969696,
"acc_stderr": 0.02833560973246335,
"acc_norm": 0.19696969696969696,
"acc_norm_stderr": 0.02833560973246335
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.30569948186528495,
"acc_stderr": 0.03324837939758159,
"acc_norm": 0.30569948186528495,
"acc_norm_stderr": 0.03324837939758159
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.022489389793654824,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.022489389793654824
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844086,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844086
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.02702543349888239,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.02702543349888239
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.03336767086567977,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.03336767086567977
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22018348623853212,
"acc_stderr": 0.01776597865232757,
"acc_norm": 0.22018348623853212,
"acc_norm_stderr": 0.01776597865232757
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.028963702570791033,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.028963702570791033
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.03166009679399812,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.03166009679399812
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2869198312236287,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.2869198312236287,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.39461883408071746,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.39461883408071746,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.030236389942173095,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.030236389942173095
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2835249042145594,
"acc_stderr": 0.016117318166832265,
"acc_norm": 0.2835249042145594,
"acc_norm_stderr": 0.016117318166832265
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2745664739884393,
"acc_stderr": 0.02402774515526501,
"acc_norm": 0.2745664739884393,
"acc_norm_stderr": 0.02402774515526501
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.025553169991826507,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.025553169991826507
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2347266881028939,
"acc_stderr": 0.024071805887677045,
"acc_norm": 0.2347266881028939,
"acc_norm_stderr": 0.024071805887677045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090202,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090202
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2627118644067797,
"acc_stderr": 0.01124054551499566,
"acc_norm": 0.2627118644067797,
"acc_norm_stderr": 0.01124054551499566
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21691176470588236,
"acc_stderr": 0.025035845227711254,
"acc_norm": 0.21691176470588236,
"acc_norm_stderr": 0.025035845227711254
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.0180540274588152,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.0180540274588152
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1836734693877551,
"acc_stderr": 0.024789071332007653,
"acc_norm": 0.1836734693877551,
"acc_norm_stderr": 0.024789071332007653
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.035509201856896294,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.035509201856896294
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062137,
"mc2": 0.4095943166947606,
"mc2_stderr": 0.014642509125225842
}
}
```
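If you prefer the raw JSON file linked above to the parquet-backed "results" configuration, it can be fetched directly from the dataset repo with `huggingface_hub` (a sketch; the filename is the one referenced in the link above, and depending on the harness version the scores may sit at the top level, as in the snippet, or under a "results" key):
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results JSON referenced above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_NoIdeaLand__test-2048-1500ck",
    filename="results_2023-09-14T04-39-40.489809.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Fall back gracefully if the scores are nested under a "results" key.
scores = results.get("results", results)
print(scores["all"])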
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Vath/bilek | 2023-09-14T05:35:00.000Z | [
"region:us"
] | Vath | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Faradaylab__ARIA-70B-V2 | 2023-09-14T05:15:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Faradaylab/ARIA-70B-V2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Faradaylab/ARIA-70B-V2](https://huggingface.co/Faradaylab/ARIA-70B-V2) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Faradaylab__ARIA-70B-V2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T05:14:04.383698](https://huggingface.co/datasets/open-llm-leaderboard/details_Faradaylab__ARIA-70B-V2/blob/main/results_2023-09-14T05-14-04.383698.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.634284380404607,\n\
\ \"acc_stderr\": 0.03297009687797655,\n \"acc_norm\": 0.638413500758921,\n\
\ \"acc_norm_stderr\": 0.03294467490940201,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4979524095981481,\n\
\ \"mc2_stderr\": 0.014785337524777346\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5784982935153583,\n \"acc_stderr\": 0.014430197069326025,\n\
\ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000322\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6558454491137223,\n\
\ \"acc_stderr\": 0.004741208229092876,\n \"acc_norm\": 0.8568014339772954,\n\
\ \"acc_norm_stderr\": 0.0034955936625207357\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n\
\ \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.029582245128384303,\n\
\ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.029582245128384303\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851088,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851088\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n\
\ \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n\
\ \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603489,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603489\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.0291265228345868,\n \"acc_norm\"\
: 0.7878787878787878,\n \"acc_norm_stderr\": 0.0291265228345868\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465397,\n \
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465397\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083018,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.423841059602649,\n \"acc_stderr\": 0.04034846678603397,\n \"acc_norm\"\
: 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603397\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n\
\ \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n\
\ \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n\
\ \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \
\ \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7399103139013453,\n\
\ \"acc_stderr\": 0.029442495585857476,\n \"acc_norm\": 0.7399103139013453,\n\
\ \"acc_norm_stderr\": 0.029442495585857476\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.0345727283691767,\n \"acc_norm\"\
: 0.8264462809917356,\n \"acc_norm_stderr\": 0.0345727283691767\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903354,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903354\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917202,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917202\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33743016759776534,\n\
\ \"acc_stderr\": 0.015813901283913044,\n \"acc_norm\": 0.33743016759776534,\n\
\ \"acc_norm_stderr\": 0.015813901283913044\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.02664327847450875,\n\
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.02664327847450875\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.485006518904824,\n\
\ \"acc_stderr\": 0.012764493202193257,\n \"acc_norm\": 0.485006518904824,\n\
\ \"acc_norm_stderr\": 0.012764493202193257\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.030233758551596445,\n\
\ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.030233758551596445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209308,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209308\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174913,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174913\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018515,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018515\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.038913644958358196,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.038913644958358196\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4979524095981481,\n\
\ \"mc2_stderr\": 0.014785337524777346\n }\n}\n```"
repo_url: https://huggingface.co/Faradaylab/ARIA-70B-V2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|arc:challenge|25_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hellaswag|10_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T05-14-04.383698.parquet'
- config_name: results
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- results_2023-09-14T05-14-04.383698.parquet
- split: latest
path:
- results_2023-09-14T05-14-04.383698.parquet
---
# Dataset Card for Evaluation run of Faradaylab/ARIA-70B-V2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Faradaylab/ARIA-70B-V2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Faradaylab/ARIA-70B-V2](https://huggingface.co/Faradaylab/ARIA-70B-V2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
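# Per-sample details for the 0-shot TruthfulQA (mc) task of this run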
data = load_dataset("open-llm-leaderboard/details_Faradaylab__ARIA-70B-V2",
"harness_truthfulqa_mc_0",
	split="latest")
```
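The aggregated metrics can be loaded the same way from the additional "results" configuration declared above; a minimal sketch, assuming the same `datasets` API:
```python
from datasets import load_dataset

# Aggregated run-level metrics; the "latest" split points to the newest eval
results = load_dataset("open-llm-leaderboard/details_Faradaylab__ARIA-70B-V2",
                       "results",
                       split="latest")
print(results[0])
```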
## Latest results
These are the [latest results from run 2023-09-14T05:14:04.383698](https://huggingface.co/datasets/open-llm-leaderboard/details_Faradaylab__ARIA-70B-V2/blob/main/results_2023-09-14T05-14-04.383698.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.634284380404607,
"acc_stderr": 0.03297009687797655,
"acc_norm": 0.638413500758921,
"acc_norm_stderr": 0.03294467490940201,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.4979524095981481,
"mc2_stderr": 0.014785337524777346
},
"harness|arc:challenge|25": {
"acc": 0.5784982935153583,
"acc_stderr": 0.014430197069326025,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000322
},
"harness|hellaswag|10": {
"acc": 0.6558454491137223,
"acc_stderr": 0.004741208229092876,
"acc_norm": 0.8568014339772954,
"acc_norm_stderr": 0.0034955936625207357
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.029582245128384303,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.029582245128384303
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851088,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851088
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.0291265228345868,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.0291265228345868
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465397,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465397
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083018,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603397,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603397
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801588,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801588
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7399103139013453,
"acc_stderr": 0.029442495585857476,
"acc_norm": 0.7399103139013453,
"acc_norm_stderr": 0.029442495585857476
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.0345727283691767,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.0345727283691767
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903354,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903354
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917202,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917202
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33743016759776534,
"acc_stderr": 0.015813901283913044,
"acc_norm": 0.33743016759776534,
"acc_norm_stderr": 0.015813901283913044
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.02664327847450875,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.02664327847450875
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.485006518904824,
"acc_stderr": 0.012764493202193257,
"acc_norm": 0.485006518904824,
"acc_norm_stderr": 0.012764493202193257
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.030233758551596445,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.030233758551596445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.01890101532209308,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.01890101532209308
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174913,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174913
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018515,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018515
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.038913644958358196,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.038913644958358196
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.4979524095981481,
"mc2_stderr": 0.014785337524777346
}
}
```
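Each `harness|hendrycksTest-*` entry above is a per-subject accuracy, so the unweighted MMLU average can be recomputed directly from this JSON. A minimal sketch, assuming the results file linked above has been saved locally as `results.json` (a hypothetical filename):
```python
import json

# Average the 57 MMLU (hendrycksTest) subject accuracies of this run
with open("results.json") as f:
    results = json.load(f)

mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu_accs)} MMLU subjects, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```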
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BangumiBase/watashinoyuriwaoshigotodesu | 2023-09-29T07:26:06.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Watashi No Yuri Wa Oshigoto Desu!
This is the image base of the bangumi Watashi no Yuri wa Oshigoto Desu!, in which we detected 31 characters and 3255 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability), as sketched below.
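A minimal sketch of that download-and-extract step, assuming the `huggingface_hub` client is installed (the output directory name is arbitrary):
```python
from zipfile import ZipFile
from huggingface_hub import hf_hub_download

# Fetch the full image pack (all.zip) from this dataset repository
pack = hf_hub_download(repo_id="BangumiBase/watashinoyuriwaoshigotodesu",
                       filename="all.zip",
                       repo_type="dataset")

# Extract locally, then manually review the per-character folders to drop
# the roughly 1% of potentially noisy samples before training
with ZipFile(pack) as zf:
    zf.extractall("watashi_no_yuri_image_base")
```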
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 221 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 10 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 15 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 12 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 12 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 10 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 23 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 14 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 26 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 22 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 416 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 142 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 31 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 5 | [Download](13/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 14 | 420 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 63 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 23 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 970 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 87 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 364 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 60 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 21 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 36 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 11 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 12 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 13 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 10 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 24 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 29 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 13 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 140 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
yuanmei424/fonts_ds | 2023-09-27T22:50:32.000Z | [
"region:us"
] | yuanmei424 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: edit_prompt
dtype: string
- name: input_image
dtype: image
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 83621453418.25
num_examples: 19837823
download_size: 0
dataset_size: 83621453418.25
---
# Dataset Card for "fonts_ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/aiba_yumi_idolmastercinderellagirls | 2023-09-17T17:36:46.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of aiba_yumi (THE iDOLM@STER: Cinderella Girls)
This is the dataset of aiba_yumi (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 526 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 526 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 526 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 526 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
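As a usage note, here is a hedged sketch of how one of the packaged archives listed above might be fetched and unpacked; the repo id and filename are taken from this card's table, and `hf_hub_download` is the standard `huggingface_hub` helper.

```python
# Sketch: fetch and unpack one of the packaged archives from the table above.
import zipfile
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="CyberHarem/aiba_yumi_idolmastercinderellagirls",
    filename="dataset-512x512.zip",  # any "Download" entry from the table
    repo_type="dataset",
)
with zipfile.ZipFile(path) as zf:
    zf.extractall("aiba_yumi_512x512")  # images plus their tag files, per the card
```

The same pattern applies to the other packaged datasets of this kind listed on this page.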
|
gangkongkong/koalpaca-llama2 | 2023-09-14T05:40:14.000Z | [
"license:apache-2.0",
"region:us"
] | gangkongkong | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
CyberHarem/senkawa_chihiro_idolmastercinderellagirls | 2023-09-17T17:36:48.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of senkawa_chihiro (THE iDOLM@STER: Cinderella Girls)
This is the dataset of senkawa_chihiro (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images were crawled from multiple sites (e.g. Danbooru, Pixiv, Zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 530 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 530 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 530 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 530 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/natalia_idolmastercinderellagirls | 2023-09-17T17:36:50.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of natalia (THE iDOLM@STER: Cinderella Girls)
This is the dataset of natalia (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images were crawled from multiple sites (e.g. Danbooru, Pixiv, Zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 549 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 549 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 549 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 549 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
BangumiBase/sakurasounopetnakanojo | 2023-09-29T07:39:39.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Sakurasou No Pet Na Kanojo
This is the image base of the bangumi Sakurasou no Pet na Kanojo; we detected 24 characters and 4,107 images in total. The full dataset is available [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to train models on this dataset, we recommend preprocessing the downloaded data to eliminate potentially noisy samples (roughly 1% of images); a minimal filtering sketch follows the preview table below.
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1328 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 405 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 313 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 33 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 18 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 46 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 47 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 74 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 580 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 105 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 43 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 523 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 43 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 71 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 11 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 21 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 139 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 13 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 9 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 28 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 9 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 20 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 9 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 219 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
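As referenced above, here is a minimal sketch of the suggested preprocessing: fetch each character's archive while skipping the separate `noise` cluster (`-1`). Note this only drops the noise cluster; the ~1% residual noise inside the character folders still needs manual review.

```python
# Sketch: download the per-character archives, skipping the "noise" cluster (-1).
import zipfile
from huggingface_hub import hf_hub_download

REPO = "BangumiBase/sakurasounopetnakanojo"
for cluster in range(23):  # character clusters 0-22 from the table above
    path = hf_hub_download(
        repo_id=REPO,
        filename=f"{cluster}/dataset.zip",
        repo_type="dataset",
    )
    with zipfile.ZipFile(path) as zf:
        zf.extractall(f"sakurasou/{cluster}")
# Residual noisy samples (~1%) within each folder still warrant manual review.
```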
|
acul3/mc4_und_id | 2023-09-14T08:52:19.000Z | [
"region:us"
] | acul3 | null | null | null | 0 | 0 | Entry not found |
CyberHarem/shirayuki_chiyo_idolmastercinderellagirls | 2023-09-17T17:36:52.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shirayuki_chiyo (THE iDOLM@STER: Cinderella Girls)
This is the dataset of shirayuki_chiyo (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images were crawled from multiple sites (e.g. Danbooru, Pixiv, Zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 489 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 489 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 489 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 489 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
moylink/test20230914 | 2023-09-14T07:17:37.000Z | [
"license:openrail",
"region:us"
] | moylink | null | null | null | 0 | 0 | ---
license: openrail
---
|
innnky/nyaru | 2023-09-14T10:23:23.000Z | [
"region:us"
] | innnky | null | null | null | 0 | 0 | Entry not found |
lincroe/ga-ml | 2023-09-14T07:27:34.000Z | [
"region:us"
] | lincroe | null | null | null | 0 | 0 | Entry not found |
patrickvonplaten/testtest | 2023-09-14T07:29:35.000Z | [
"region:us"
] | patrickvonplaten | null | null | null | 0 | 0 | Entry not found |
0xk1h0/Py150k-vuln-scanned | 2023-09-14T07:43:19.000Z | [
"license:mit",
"region:us"
] | 0xk1h0 | null | null | null | 1 | 0 | ---
license: mit
---
|
pharaouk/corpus_1_clustered_2 | 2023-09-14T07:37:18.000Z | [
"region:us"
] | pharaouk | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
- name: embedding
sequence: float64
- name: text_processed
dtype: string
- name: __index_level_0__
dtype: int64
- name: cluster
sequence: int64
splits:
- name: train
num_bytes: 99791008
num_examples: 10000
download_size: 75705515
dataset_size: 99791008
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "corpus_1_clustered_2"
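A minimal sketch of loading this corpus and inspecting the fields declared in the front matter (the split has 10,000 rows, ~76 MB download); this assumes the default config is loadable as published.

```python
# Sketch: load the clustered corpus and peek at the declared fields.
from datasets import load_dataset

ds = load_dataset("pharaouk/corpus_1_clustered_2", split="train")  # 10,000 rows

row = ds[0]
print(row["unique_conversation_id"], row["cluster"])  # cluster is a sequence of ints
print(len(row["embedding"]))                          # embedding dimensionality
print(row["text_processed"][:200])                    # preprocessed text preview
```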
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
raicrits/news_urls | 2023-09-14T07:43:51.000Z | [
"license:other",
"region:us"
] | raicrits | null | null | null | 0 | 0 | ---
license: other
---
A collection of about 21k URLs of news articles taken from RAI news sites (national and regional). The file "urls_train_set.csv" contains around 20k of them, referring to articles published in the period 01/01/2022 – 09/03/2023, while the file "urls_test_set.csv" contains URLs referring to articles published in the period 10/03/2023 – 04/05/2023.
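A hedged sketch for fetching the two URL lists named above; the CSV column layout is not documented on this card, so inspect the columns before relying on any particular name.

```python
# Sketch: fetch the two URL lists named above. The CSV column layout is not
# documented on this card, so check the actual column names before use.
import pandas as pd
from huggingface_hub import hf_hub_download

train_csv = hf_hub_download("raicrits/news_urls", "urls_train_set.csv", repo_type="dataset")
test_csv = hf_hub_download("raicrits/news_urls", "urls_test_set.csv", repo_type="dataset")

train_urls = pd.read_csv(train_csv)
print(len(train_urls))     # ~20k rows (01/01/2022 - 09/03/2023)
print(train_urls.columns)  # inspect the real column names first
```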
|
HydraLM/corpus_1_clustered_2 | 2023-09-14T07:42:45.000Z | [
"region:us"
] | HydraLM | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
- name: embedding
sequence: float64
- name: text_processed
dtype: string
- name: __index_level_0__
dtype: int64
- name: cluster
sequence: int64
splits:
- name: train
num_bytes: 99791008
num_examples: 10000
download_size: 0
dataset_size: 99791008
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "corpus_1_clustered_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |