| datasetId | card |
|---|---|
EleutherAI/quirky_authors_alice | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 1367406.347070021
num_examples: 9719
- name: validation
num_bytes: 281716.5
num_examples: 2000
- name: test
num_bytes: 279750.5
num_examples: 2000
download_size: 883875
dataset_size: 1928873.347070021
---
# Dataset Card for "quirky_authors_alice"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
phyloforfun/HLT_MICH_Angiospermae_SLTPvA_v1-0_tiny__OCR-C35-L35-E100-R01 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 149304
num_examples: 87
download_size: 41919
dataset_size: 149304
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
richardr1126/spider-context-instruct | ---
language:
- en
license:
- cc-by-4.0
source_datasets:
- spider
pretty_name: Spider Context Instruct
tags:
- text-to-sql
- SQL
- Spider
- fine-tune
dataset_info:
features:
- name: db_id
dtype: string
- name: text
dtype: string
---
# Dataset Card for Spider Context Instruct
### Dataset Summary
Spider is a large-scale, complex, and cross-domain semantic parsing and text-to-SQL dataset annotated by 11 Yale students.
The goal of the Spider challenge is to develop natural language interfaces to cross-domain databases.
This dataset was created to finetune LLMs in a `### Instruction:` and `### Response:` format with database context.
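As a minimal sketch (assuming the default configuration exposes a standard `train` split, which the metadata above does not spell out), each row's `db_id` and instruction-formatted `text` can be inspected like this:
```python
from datasets import load_dataset

# Each row carries a `db_id` and a `text` field holding the
# instruction/response-formatted prompt with database context.
dataset = load_dataset("richardr1126/spider-context-instruct", split="train")
print(dataset[0]["db_id"])
print(dataset[0]["text"])
```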
### Yale Lily Spider Leaderboards
The leaderboard can be seen at https://yale-lily.github.io/spider
### Languages
The text in the dataset is in English.
### Licensing Information
The Spider dataset is licensed under the [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/legalcode) license.
### Citation
```
@article{yu2018spider,
title={Spider: A large-scale human-labeled dataset for complex and cross-domain semantic parsing and text-to-sql task},
author={Yu, Tao and Zhang, Rui and Yang, Kai and Yasunaga, Michihiro and Wang, Dongxu and Li, Zifan and Ma, James and Li, Irene and Yao, Qingning and Roman, Shanelle and others},
journal={arXiv preprint arXiv:1809.08887},
year={2018}
}
``` |
Arham-Imran/cityscape_11_classes | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 6896961361.3
num_examples: 2975
- name: val
num_bytes: 1197986021.0
num_examples: 500
download_size: 8226983719
dataset_size: 8094947382.3
---
# Dataset Card for "cityscape_11_classes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mursel/Turkish-wikipedia-50k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: content
dtype: string
- name: raw
dtype: string
splits:
- name: train
num_bytes: 270894547.2961262
num_examples: 50000
download_size: 160212080
dataset_size: 270894547.2961262
---
# Dataset Card for "Turkish-wikipedia-50k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/augmentatio-standardized_cluster_7_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 13200216
num_examples: 7170
download_size: 5579319
dataset_size: 13200216
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "augmentatio-standardized_cluster_7_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lonaz/test | ---
license: afl-3.0
---
|
nickrosh/Evol-Instruct-Code-80k-v1 | ---
license: cc-by-nc-sa-4.0
---
Open Source Implementation of Evol-Instruct-Code as described in the [WizardCoder Paper](https://arxiv.org/pdf/2306.08568.pdf).
Code for the instruction generation can be found on GitHub as [Evol-Teacher](https://github.com/nickrosh/evol-teacher).
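As a minimal sketch (assuming the default configuration and a `train` split, neither of which the license-only metadata documents), the data can be loaded with the `datasets` library:
```python
from datasets import load_dataset

# Column names are not documented on this card, so print the first
# row to inspect the schema before building a training pipeline.
dataset = load_dataset("nickrosh/Evol-Instruct-Code-80k-v1", split="train")
print(dataset[0])
```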
|
Locutusque/hercules-v1.0 | ---
language:
- en
- code
size_categories:
- 100K<n<1M
task_categories:
- text-generation
- conversational
- question-answering
tags:
- biology
- math
- chemistry
- code
- not-for-all-audiences
---
# hercules-v1.0 dataset

The Hercules-v1.0 dataset is a turbo-charged version of teknium/openhermes, achieved by augmenting its data sources. Some of the datasets used in teknium/openhermes are older versions; Hercules-v1.0 addresses this by updating data sources such as airoboros and WizardLM. Additionally, Hercules-v1.0 uses ise-uiuc/Magicoder-Evol-Instruct-110K instead of sahil2801/CodeAlpaca-20k as the primary code dataset.
Furthermore, I have removed the Unnatural Instructions dataset, as it may contain "outlier" examples.
The following is a list of data sources used to generate this dataset:
- GPTeacher by teknium
- ise-uiuc/Magicoder-Evol-Instruct-110K
- jondurbin/airoboros-3.2
- WizardLM/WizardLM_evol_instruct_V2_196k
- camel-ai/math
- camel-ai/chemistry
- camel-ai/physics
- camel-ai/biology
- teknium/GPT4-LLM-Cleaned
Just like the original openhermes, this dataset underwent cleaning to eliminate RLHF refusals. This removed approximately 50,000 examples from the dataset.
Example count: 462,912
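As a minimal sketch (assuming the default configuration and a `train` split), the dataset can be loaded and the example count checked:
```python
from datasets import load_dataset

# The card states 462,912 examples; verify against the loaded split.
dataset = load_dataset("Locutusque/hercules-v1.0", split="train")
print(len(dataset))
print(dataset[0])
```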
# disclaimer
This dataset contains jondurbin/airoboros-3.2, which is said to have toxic examples. As a result, you must acknowledge/agree to the following to use this data:
- a small sampling of the data contained within is "toxic"/"harmful", and contains profanity and other types of sensitive content
- none of the content or views contained in the dataset necessarily align with my personal beliefs or opinions; they are simply text generated by LLMs without a great amount of validation
- you are able to use the dataset lawfully, particularly in locations with less-than-free speech laws
- you, and you alone, are responsible for having downloaded and used the dataset, and I am completely indemnified from any and all liabilities |
LalieRM/KNIGHT | ---
pretty_name: knight
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
krishan-CSE/Davidson_Hate_Speeh_Original | ---
license: apache-2.0
---
|
Sesgaro/PICIN_ASISTENT | ---
license: mit
---
|
CyberHarem/m82a1_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of m82a1/M82A1/M82A1 (Girls' Frontline)
This is the dataset of m82a1/M82A1/M82A1 (Girls' Frontline), containing 35 images and their tags.
The core tags of this character are `pink_hair, long_hair, breasts, pink_eyes, bangs, very_long_hair, medium_breasts, hair_between_eyes, hair_ornament, large_breasts, headgear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 35 | 63.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m82a1_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 35 | 31.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m82a1_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 73 | 59.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m82a1_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 35 | 55.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m82a1_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 73 | 93.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m82a1_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/m82a1_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 35 |  |  |  |  |  | 1girl, solo, cleavage, looking_at_viewer, fingerless_gloves, holding, sniper_rifle, black_thighhighs, bare_shoulders, collarbone, black_gloves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | looking_at_viewer | fingerless_gloves | holding | sniper_rifle | black_thighhighs | bare_shoulders | collarbone | black_gloves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------------------|:--------------------|:----------|:---------------|:-------------------|:-----------------|:-------------|:---------------|
| 0 | 35 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
|
lilacai/lilac-hncomments-1m | ---
tags:
- Lilac
---
# lilac/hncomments-1m
This dataset is a [Lilac](http://lilacml.com)-processed dataset. Original dataset: [https://huggingface.co/datasets/OpenPipe/hacker-news](https://huggingface.co/datasets/OpenPipe/hacker-news)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-hncomments-1m
```
or from Python with:
```py
import lilac as ll

ll.download("lilacai/lilac-hncomments-1m")
```
|
Felladrin/ChatML-Neural-DPO | ---
license: apache-2.0
language:
- en
size_categories:
- 1K<n<10K
---
[NeuralNovel/Neural-DPO](https://huggingface.co/datasets/NeuralNovel/Neural-DPO) in ChatML format, ready to use in [HuggingFace TRL's DPO Trainer](https://huggingface.co/docs/trl/main/en/dpo_trainer).
Python code used for conversion:
```python
from datasets import load_dataset
dataset = load_dataset("NeuralNovel/Neural-DPO", split="train")
def format(columns):
    prompt = f"<|im_start|>user\n{columns['question']}<|im_end|>\n<|im_start|>assistant\n"
    if columns['system']:
        prompt = f"<|im_start|>system\n{columns['system']}<|im_end|>\n{prompt}"
    return {
        "prompt": prompt,
        "chosen": f"{columns['chosen']}<|im_end|>",
        "rejected": f"{columns['rejected']}<|im_end|>",
    }
dataset.map(format).select_columns(['prompt', 'chosen', 'rejected']).to_parquet("train.parquet")
```
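A minimal sketch of feeding the converted data to TRL's DPO Trainer follows; the base model is a placeholder, the hyperparameters are illustrative, and keyword names (e.g. `tokenizer` vs. `processing_class`) have shifted across TRL releases:
```python
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

model_name = "your-base-model"  # placeholder: substitute a real causal LM
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# The converted dataset already exposes `prompt`, `chosen`, and `rejected`.
dataset = load_dataset("Felladrin/ChatML-Neural-DPO", split="train")

trainer = DPOTrainer(
    model=model,
    args=DPOConfig(output_dir="dpo-output", beta=0.1),
    train_dataset=dataset,
    tokenizer=tokenizer,
)
trainer.train()
```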
|
deepghs/anime_head_detection | ---
license: mit
task_categories:
- object-detection
tags:
- art
size_categories:
- 10K<n<100K
---
Dataset for anime head detection (including the entire head, not only the face).
| Dataset | Train | Test | Validation | Description |
|------------------------|-------|------|----------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| ani_face_detection.v1i | 25698 | 113 | 253 | A high-quality third-party dataset (it seems to no longer be publicly available; please contact me for removal if it infringes your rights) that can be used for training directly. Although its name includes `face`, what it actually annotates is `head`. |
We provide an [online demo](https://huggingface.co/spaces/deepghs/anime_object_detection) here. |
open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.11 | ---
pretty_name: Evaluation run of SF-Foundation/Ein-72B-v0.11
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SF-Foundation/Ein-72B-v0.11](https://huggingface.co/SF-Foundation/Ein-72B-v0.11)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.11\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-11T13:40:58.813057](https://huggingface.co/datasets/open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.11/blob/main/results_2024-02-11T13-40-58.813057.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.772373168297044,\n\
\ \"acc_stderr\": 0.028022585208284104,\n \"acc_norm\": 0.7739457676486081,\n\
\ \"acc_norm_stderr\": 0.02857928542974863,\n \"mc1\": 0.6634026927784578,\n\
\ \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.790182015835219,\n\
\ \"mc2_stderr\": 0.013777445073321324\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7474402730375427,\n \"acc_stderr\": 0.012696728980207704,\n\
\ \"acc_norm\": 0.7679180887372014,\n \"acc_norm_stderr\": 0.012336718284948856\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7343158733320055,\n\
\ \"acc_stderr\": 0.004407941058874964,\n \"acc_norm\": 0.890161322445728,\n\
\ \"acc_norm_stderr\": 0.003120495238827559\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n\
\ \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n\
\ \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474928,\n\
\ \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474928\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n\
\ \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8377358490566038,\n \"acc_stderr\": 0.02269148287203535,\n\
\ \"acc_norm\": 0.8377358490566038,\n \"acc_norm_stderr\": 0.02269148287203535\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9375,\n\
\ \"acc_stderr\": 0.02024219611347799,\n \"acc_norm\": 0.9375,\n \
\ \"acc_norm_stderr\": 0.02024219611347799\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n\
\ \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.7514450867052023,\n\
\ \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.026148818018424506,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.026148818018424506\n \
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7793103448275862,\n \"acc_stderr\": 0.0345593020192481,\n\
\ \"acc_norm\": 0.7793103448275862,\n \"acc_norm_stderr\": 0.0345593020192481\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6825396825396826,\n \"acc_stderr\": 0.023973861998992072,\n \"\
acc_norm\": 0.6825396825396826,\n \"acc_norm_stderr\": 0.023973861998992072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8870967741935484,\n \"acc_stderr\": 0.01800360332586361,\n \"\
acc_norm\": 0.8870967741935484,\n \"acc_norm_stderr\": 0.01800360332586361\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6600985221674877,\n \"acc_stderr\": 0.033327690684107895,\n \"\
acc_norm\": 0.6600985221674877,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\"\
: 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n\
\ \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9393939393939394,\n \"acc_stderr\": 0.016999994927421592,\n \"\
acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.016999994927421592\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084315,\n\
\ \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084315\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8051282051282052,\n \"acc_stderr\": 0.020083167595181393,\n\
\ \"acc_norm\": 0.8051282051282052,\n \"acc_norm_stderr\": 0.020083167595181393\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.45925925925925926,\n \"acc_stderr\": 0.030384169232350818,\n \
\ \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.030384169232350818\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n\
\ \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5629139072847682,\n \"acc_stderr\": 0.040500357222306355,\n \"\
acc_norm\": 0.5629139072847682,\n \"acc_norm_stderr\": 0.040500357222306355\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9357798165137615,\n \"acc_stderr\": 0.0105104947132014,\n \"acc_norm\"\
: 0.9357798165137615,\n \"acc_norm_stderr\": 0.0105104947132014\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.0316746870682898,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.0316746870682898\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.9019607843137255,\n \"acc_stderr\": 0.020871118455552104,\n\
\ \"acc_norm\": 0.9019607843137255,\n \"acc_norm_stderr\": 0.020871118455552104\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9071729957805907,\n \"acc_stderr\": 0.018889750550956715,\n \
\ \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.018889750550956715\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n\
\ \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n\
\ \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540616,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540616\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n\
\ \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\
\ \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n\
\ \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761011,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761011\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n\
\ \"acc_stderr\": 0.015006312806446914,\n \"acc_norm\": 0.9444444444444444,\n\
\ \"acc_norm_stderr\": 0.015006312806446914\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977725,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977725\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9157088122605364,\n\
\ \"acc_stderr\": 0.009934966499513791,\n \"acc_norm\": 0.9157088122605364,\n\
\ \"acc_norm_stderr\": 0.009934966499513791\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.838150289017341,\n \"acc_stderr\": 0.019829299214925416,\n\
\ \"acc_norm\": 0.838150289017341,\n \"acc_norm_stderr\": 0.019829299214925416\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6994413407821229,\n\
\ \"acc_stderr\": 0.01533456680625116,\n \"acc_norm\": 0.6994413407821229,\n\
\ \"acc_norm_stderr\": 0.01533456680625116\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8464052287581699,\n \"acc_stderr\": 0.02064559791041878,\n\
\ \"acc_norm\": 0.8464052287581699,\n \"acc_norm_stderr\": 0.02064559791041878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8327974276527331,\n\
\ \"acc_stderr\": 0.021193872528034962,\n \"acc_norm\": 0.8327974276527331,\n\
\ \"acc_norm_stderr\": 0.021193872528034962\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8641975308641975,\n \"acc_stderr\": 0.019061588181505405,\n\
\ \"acc_norm\": 0.8641975308641975,\n \"acc_norm_stderr\": 0.019061588181505405\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6631205673758865,\n \"acc_stderr\": 0.02819553487396673,\n \
\ \"acc_norm\": 0.6631205673758865,\n \"acc_norm_stderr\": 0.02819553487396673\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6134289439374185,\n\
\ \"acc_stderr\": 0.012437288868088725,\n \"acc_norm\": 0.6134289439374185,\n\
\ \"acc_norm_stderr\": 0.012437288868088725\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02236867256288675,\n\
\ \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02236867256288675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8202614379084967,\n \"acc_stderr\": 0.01553374508338279,\n \
\ \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.01553374508338279\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n\
\ \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9054726368159204,\n\
\ \"acc_stderr\": 0.020687186951534094,\n \"acc_norm\": 0.9054726368159204,\n\
\ \"acc_norm_stderr\": 0.020687186951534094\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015578,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015578\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6634026927784578,\n\
\ \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.790182015835219,\n\
\ \"mc2_stderr\": 0.013777445073321324\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7877179681576952,\n \
\ \"acc_stderr\": 0.011263783355400313\n }\n}\n```"
repo_url: https://huggingface.co/SF-Foundation/Ein-72B-v0.11
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|arc:challenge|25_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|gsm8k|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hellaswag|10_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T13-40-58.813057.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T13-40-58.813057.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- '**/details_harness|winogrande|5_2024-02-11T13-40-58.813057.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-11T13-40-58.813057.parquet'
- config_name: results
data_files:
- split: 2024_02_11T13_40_58.813057
path:
- results_2024-02-11T13-40-58.813057.parquet
- split: latest
path:
- results_2024-02-11T13-40-58.813057.parquet
---
# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.11
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SF-Foundation/Ein-72B-v0.11](https://huggingface.co/SF-Foundation/Ein-72B-v0.11) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.11",
"harness_winogrande_5",
split="train")
```
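The aggregated scores live in the "results" configuration defined above; a minimal sketch for loading them (the "latest" split always points to the most recent run):
```python
from datasets import load_dataset

# Load the aggregated metrics; the "latest" split tracks the most recent run.
results = load_dataset("open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.11",
	"results",
	split="latest")
```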
## Latest results
These are the [latest results from run 2024-02-11T13:40:58.813057](https://huggingface.co/datasets/open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.11/blob/main/results_2024-02-11T13-40-58.813057.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.772373168297044,
"acc_stderr": 0.028022585208284104,
"acc_norm": 0.7739457676486081,
"acc_norm_stderr": 0.02857928542974863,
"mc1": 0.6634026927784578,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.790182015835219,
"mc2_stderr": 0.013777445073321324
},
"harness|arc:challenge|25": {
"acc": 0.7474402730375427,
"acc_stderr": 0.012696728980207704,
"acc_norm": 0.7679180887372014,
"acc_norm_stderr": 0.012336718284948856
},
"harness|hellaswag|10": {
"acc": 0.7343158733320055,
"acc_stderr": 0.004407941058874964,
"acc_norm": 0.890161322445728,
"acc_norm_stderr": 0.003120495238827559
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474928,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8377358490566038,
"acc_stderr": 0.02269148287203535,
"acc_norm": 0.8377358490566038,
"acc_norm_stderr": 0.02269148287203535
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9375,
"acc_stderr": 0.02024219611347799,
"acc_norm": 0.9375,
"acc_norm_stderr": 0.02024219611347799
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8,
"acc_stderr": 0.026148818018424506,
"acc_norm": 0.8,
"acc_norm_stderr": 0.026148818018424506
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7793103448275862,
"acc_stderr": 0.0345593020192481,
"acc_norm": 0.7793103448275862,
"acc_norm_stderr": 0.0345593020192481
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6825396825396826,
"acc_stderr": 0.023973861998992072,
"acc_norm": 0.6825396825396826,
"acc_norm_stderr": 0.023973861998992072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8870967741935484,
"acc_stderr": 0.01800360332586361,
"acc_norm": 0.8870967741935484,
"acc_norm_stderr": 0.01800360332586361
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6600985221674877,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.6600985221674877,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9393939393939394,
"acc_stderr": 0.016999994927421592,
"acc_norm": 0.9393939393939394,
"acc_norm_stderr": 0.016999994927421592
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9844559585492227,
"acc_stderr": 0.008927492715084315,
"acc_norm": 0.9844559585492227,
"acc_norm_stderr": 0.008927492715084315
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8051282051282052,
"acc_stderr": 0.020083167595181393,
"acc_norm": 0.8051282051282052,
"acc_norm_stderr": 0.020083167595181393
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.030384169232350818,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.030384169232350818
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5629139072847682,
"acc_stderr": 0.040500357222306355,
"acc_norm": 0.5629139072847682,
"acc_norm_stderr": 0.040500357222306355
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9357798165137615,
"acc_stderr": 0.0105104947132014,
"acc_norm": 0.9357798165137615,
"acc_norm_stderr": 0.0105104947132014
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.0316746870682898,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.0316746870682898
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.020871118455552104,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.020871118455552104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.018889750550956715,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.018889750550956715
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540616,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540616
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243630999,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243630999
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761011,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761011
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446914,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446914
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977725,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977725
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9157088122605364,
"acc_stderr": 0.009934966499513791,
"acc_norm": 0.9157088122605364,
"acc_norm_stderr": 0.009934966499513791
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.838150289017341,
"acc_stderr": 0.019829299214925416,
"acc_norm": 0.838150289017341,
"acc_norm_stderr": 0.019829299214925416
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6994413407821229,
"acc_stderr": 0.01533456680625116,
"acc_norm": 0.6994413407821229,
"acc_norm_stderr": 0.01533456680625116
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8464052287581699,
"acc_stderr": 0.02064559791041878,
"acc_norm": 0.8464052287581699,
"acc_norm_stderr": 0.02064559791041878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8327974276527331,
"acc_stderr": 0.021193872528034962,
"acc_norm": 0.8327974276527331,
"acc_norm_stderr": 0.021193872528034962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8641975308641975,
"acc_stderr": 0.019061588181505405,
"acc_norm": 0.8641975308641975,
"acc_norm_stderr": 0.019061588181505405
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6631205673758865,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.6631205673758865,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6134289439374185,
"acc_stderr": 0.012437288868088725,
"acc_norm": 0.6134289439374185,
"acc_norm_stderr": 0.012437288868088725
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02236867256288675,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02236867256288675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.01553374508338279,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.01553374508338279
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9054726368159204,
"acc_stderr": 0.020687186951534094,
"acc_norm": 0.9054726368159204,
"acc_norm_stderr": 0.020687186951534094
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015578,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015578
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6634026927784578,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.790182015835219,
"mc2_stderr": 0.013777445073321324
},
"harness|winogrande|5": {
"acc": 0.840568271507498,
"acc_stderr": 0.010288617479454764
},
"harness|gsm8k|5": {
"acc": 0.7877179681576952,
"acc_stderr": 0.011263783355400313
}
}
```
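To work with these numbers programmatically, one option is to fetch the raw results JSON linked above with `huggingface_hub` (a sketch; the excerpt above shows the per-task scores, but the file's exact top-level layout is an assumption worth checking first):
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file referenced above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.11",
    repo_type="dataset",
    filename="results_2024-02-11T13-40-58.813057.json",
)
with open(path) as f:
    data = json.load(f)

# Inspect the top-level keys to locate the per-task scores shown above.
print(list(data.keys()))
```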
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
woalsdnd/law.go.kr | ---
license: mit
task_categories:
- text-retrieval
- feature-extraction
language:
- ko
tags:
- legal
pretty_name: Legal case retrieval with Korean Precedents
size_categories:
- 10K<n<100K
---
<h1 align="center">Legal case retrieval with Korean Precedents (powered by https://law.go.kr/)</h1>
This dataset repository maintains files required for legal case retrieval using Korean Precedents acquired from https://law.go.kr/
For code and more information, refer to the **[GitHub page](https://github.com/jaeminSon/law.go.kr-cases/tree/main)**
|
enoahjr/twitter_dataset_1713187680 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 799049
num_examples: 2382
download_size: 421174
dataset_size: 799049
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gr3yshadow/vizzdata | ---
license: mit
---
|
zhangxuri/test | ---
task_categories:
- text-classification
language:
- aa
tags:
- chemistry
- zhangxu7ri
size_categories:
- 1K<n<10K
- n<1K
---
This is a test dataset. |
jholst/public-dataset | ---
license: apache-2.0
language:
- en
--- |
CyberHarem/saika_magoichi_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of saika_magoichi/雑賀孫一/杂贺孙一 (Fate/Grand Order)
This is the dataset of saika_magoichi/雑賀孫一/杂贺孙一 (Fate/Grand Order), containing 37 images and their tags.
The core tags of this character are `long_hair, hair_ornament, feather_hair_ornament, hair_between_eyes, blue_eyes, white_hair, grey_hair, crossed_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 37 | 65.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saika_magoichi_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200             | 37       | 54.08 MiB  | [Download](https://huggingface.co/datasets/CyberHarem/saika_magoichi_fgo/resolve/main/dataset-1200.zip)             | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 80 | 102.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saika_magoichi_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
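The IMG+TXT packages can also be fetched directly; below is a minimal sketch for the 1200px variant, using the same `hf_hub_download` pattern as the waifuc example that follows (the local directory name is just an illustration):
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the IMG+TXT archive (1200px variant)
zip_file = hf_hub_download(
    repo_id='CyberHarem/saika_magoichi_fgo',
    repo_type='dataset',
    filename='dataset-1200.zip',
)

# extract image/tag-text pairs to a local directory (name is illustrative)
dataset_dir = 'dataset_1200'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```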
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/saika_magoichi_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, eating, holding_food, feathers, fingerless_gloves, onigiri, solo, black_gloves, looking_at_viewer, blush, rice_on_face, simple_background, upper_body, white_background |
| 1 | 7 |  |  |  |  |  | 1girl, solo, cape, feathers, looking_at_viewer, black_gloves, holding_gun, fingerless_gloves, upper_body, aiming_at_viewer, simple_background, white_background |
| 2 | 8 |  |  |  |  |  | 1girl, feathers, simple_background, solo, looking_at_viewer, upper_body, white_background, closed_mouth, red_cape, red_cloak, high_collar, bright_pupils |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | eating | holding_food | feathers | fingerless_gloves | onigiri | solo | black_gloves | looking_at_viewer | blush | rice_on_face | simple_background | upper_body | white_background | cape | holding_gun | aiming_at_viewer | closed_mouth | red_cape | red_cloak | high_collar | bright_pupils |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:---------------|:-----------|:--------------------|:----------|:-------|:---------------|:--------------------|:--------|:---------------|:--------------------|:-------------|:-------------------|:-------|:--------------|:-------------------|:---------------|:-----------|:------------|:--------------|:----------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | | | X | X | | X | X | X | | | X | X | X | X | X | X | | | | | |
| 2 | 8 |  |  |  |  |  | X | | | X | | | X | | X | | | X | X | X | | | | X | X | X | X | X |
|
Rane2021/Med_train | ---
license: mit
---
|
AryanNsc/Mainspacehubdata | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 10911
num_examples: 39
download_size: 8319
dataset_size: 10911
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Mainspacehubdata"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nz/losest_to_3000_range_1000_to_9000 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 3799087.274617199
num_examples: 10000
- name: test
num_bytes: 379908.7274617199
num_examples: 1000
download_size: 2174332
dataset_size: 4178996.0020789187
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Trickshotblaster/my_awesome_dataset | ---
license: mit
---
|
ramgus/albumcoversongtitle | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 118136535.258
num_examples: 1181
download_size: 92338136
dataset_size: 118136535.258
---
# Dataset Card for "albumcoversongtitle"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zjguoHF/processed_wikitext103_validation_dataset | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: validation
num_bytes: 1522176
num_examples: 3760
download_size: 542292
dataset_size: 1522176
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
heliosprime/twitter_dataset_1712954546 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11859
num_examples: 28
download_size: 10201
dataset_size: 11859
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712954546"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_rishiraj__smol-3b | ---
pretty_name: Evaluation run of rishiraj/smol-3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rishiraj/smol-3b](https://huggingface.co/rishiraj/smol-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rishiraj__smol-3b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-13T10:26:39.414520](https://huggingface.co/datasets/open-llm-leaderboard/details_rishiraj__smol-3b/blob/main/results_2023-12-13T10-26-39.414520.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4630403817848583,\n\
\ \"acc_stderr\": 0.03471311735087895,\n \"acc_norm\": 0.467024043140488,\n\
\ \"acc_norm_stderr\": 0.035458188509216476,\n \"mc1\": 0.32313341493268055,\n\
\ \"mc1_stderr\": 0.0163718362864546,\n \"mc2\": 0.5073211090135596,\n\
\ \"mc2_stderr\": 0.015470937650792245\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.42662116040955633,\n \"acc_stderr\": 0.014453185592920293,\n\
\ \"acc_norm\": 0.46331058020477817,\n \"acc_norm_stderr\": 0.014572000527756998\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.510157339175463,\n\
\ \"acc_stderr\": 0.004988751698341138,\n \"acc_norm\": 0.6823341963752241,\n\
\ \"acc_norm_stderr\": 0.004646172373100999\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.031068985963122145,\n\
\ \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.031068985963122145\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2804232804232804,\n \"acc_stderr\": 0.023135287974325625,\n \"\
acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.023135287974325625\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.532258064516129,\n\
\ \"acc_stderr\": 0.02838474778881333,\n \"acc_norm\": 0.532258064516129,\n\
\ \"acc_norm_stderr\": 0.02838474778881333\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n\
\ \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.03815494308688931,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.03815494308688931\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5303030303030303,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.5303030303030303,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6373056994818653,\n \"acc_stderr\": 0.03469713791704371,\n\
\ \"acc_norm\": 0.6373056994818653,\n \"acc_norm_stderr\": 0.03469713791704371\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4282051282051282,\n \"acc_stderr\": 0.025088301454694827,\n\
\ \"acc_norm\": 0.4282051282051282,\n \"acc_norm_stderr\": 0.025088301454694827\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2222222222222222,\n \"acc_stderr\": 0.025348097468097845,\n \
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.025348097468097845\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4327731092436975,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.4327731092436975,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.618348623853211,\n \"acc_stderr\": 0.020828148517022596,\n \"\
acc_norm\": 0.618348623853211,\n \"acc_norm_stderr\": 0.020828148517022596\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3287037037037037,\n \"acc_stderr\": 0.03203614084670058,\n \"\
acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.03203614084670058\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5931372549019608,\n \"acc_stderr\": 0.03447891136353382,\n \"\
acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.03447891136353382\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \
\ \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4663677130044843,\n\
\ \"acc_stderr\": 0.033481800170603065,\n \"acc_norm\": 0.4663677130044843,\n\
\ \"acc_norm_stderr\": 0.033481800170603065\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.043749285605997376,\n\
\ \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.043749285605997376\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.04465869780531009,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.04465869780531009\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112723,\n\
\ \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112723\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6965811965811965,\n\
\ \"acc_stderr\": 0.03011821010694265,\n \"acc_norm\": 0.6965811965811965,\n\
\ \"acc_norm_stderr\": 0.03011821010694265\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5351213282247765,\n\
\ \"acc_stderr\": 0.017835798806290642,\n \"acc_norm\": 0.5351213282247765,\n\
\ \"acc_norm_stderr\": 0.017835798806290642\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49710982658959535,\n \"acc_stderr\": 0.02691864538323901,\n\
\ \"acc_norm\": 0.49710982658959535,\n \"acc_norm_stderr\": 0.02691864538323901\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.0285803410651383,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.0285803410651383\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5080385852090032,\n\
\ \"acc_stderr\": 0.028394421370984538,\n \"acc_norm\": 0.5080385852090032,\n\
\ \"acc_norm_stderr\": 0.028394421370984538\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.027777777777777797,\n\
\ \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.027777777777777797\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.31560283687943264,\n \"acc_stderr\": 0.027724989449509317,\n \
\ \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.027724989449509317\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37353324641460234,\n\
\ \"acc_stderr\": 0.012354994823515267,\n \"acc_norm\": 0.37353324641460234,\n\
\ \"acc_norm_stderr\": 0.012354994823515267\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3639705882352941,\n \"acc_stderr\": 0.029227192460032025,\n\
\ \"acc_norm\": 0.3639705882352941,\n \"acc_norm_stderr\": 0.029227192460032025\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4444444444444444,\n \"acc_stderr\": 0.02010258389588718,\n \
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.02010258389588718\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5673469387755102,\n \"acc_stderr\": 0.03171752824062664,\n\
\ \"acc_norm\": 0.5673469387755102,\n \"acc_norm_stderr\": 0.03171752824062664\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n\
\ \"acc_stderr\": 0.03428867848778658,\n \"acc_norm\": 0.6218905472636815,\n\
\ \"acc_norm_stderr\": 0.03428867848778658\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03811079669833531,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03811079669833531\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32313341493268055,\n\
\ \"mc1_stderr\": 0.0163718362864546,\n \"mc2\": 0.5073211090135596,\n\
\ \"mc2_stderr\": 0.015470937650792245\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6535122336227308,\n \"acc_stderr\": 0.013373773411685642\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.24639878695981804,\n \
\ \"acc_stderr\": 0.011869498557755346\n }\n}\n```"
repo_url: https://huggingface.co/rishiraj/smol-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|arc:challenge|25_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|gsm8k|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hellaswag|10_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T10-26-39.414520.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T10-26-39.414520.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- '**/details_harness|winogrande|5_2023-12-13T10-26-39.414520.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-13T10-26-39.414520.parquet'
- config_name: results
data_files:
- split: 2023_12_13T10_26_39.414520
path:
- results_2023-12-13T10-26-39.414520.parquet
- split: latest
path:
- results_2023-12-13T10-26-39.414520.parquet
---
# Dataset Card for Evaluation run of rishiraj/smol-3b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rishiraj/smol-3b](https://huggingface.co/rishiraj/smol-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rishiraj__smol-3b",
"harness_winogrande_5",
split="train")
```
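Similarly, a minimal sketch (config and split names taken from this card's YAML metadata) for loading the aggregated results:
```python
from datasets import load_dataset

# The "results" config aggregates all metrics of the run;
# the "latest" split always points to the newest results.
results = load_dataset("open-llm-leaderboard/details_rishiraj__smol-3b",
	"results",
	split="latest")
```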
## Latest results
These are the [latest results from run 2023-12-13T10:26:39.414520](https://huggingface.co/datasets/open-llm-leaderboard/details_rishiraj__smol-3b/blob/main/results_2023-12-13T10-26-39.414520.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4630403817848583,
"acc_stderr": 0.03471311735087895,
"acc_norm": 0.467024043140488,
"acc_norm_stderr": 0.035458188509216476,
"mc1": 0.32313341493268055,
"mc1_stderr": 0.0163718362864546,
"mc2": 0.5073211090135596,
"mc2_stderr": 0.015470937650792245
},
"harness|arc:challenge|25": {
"acc": 0.42662116040955633,
"acc_stderr": 0.014453185592920293,
"acc_norm": 0.46331058020477817,
"acc_norm_stderr": 0.014572000527756998
},
"harness|hellaswag|10": {
"acc": 0.510157339175463,
"acc_stderr": 0.004988751698341138,
"acc_norm": 0.6823341963752241,
"acc_norm_stderr": 0.004646172373100999
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3446808510638298,
"acc_stderr": 0.031068985963122145,
"acc_norm": 0.3446808510638298,
"acc_norm_stderr": 0.031068985963122145
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.023135287974325625,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.023135287974325625
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.532258064516129,
"acc_stderr": 0.02838474778881333,
"acc_norm": 0.532258064516129,
"acc_norm_stderr": 0.02838474778881333
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.03815494308688931,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.03815494308688931
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5303030303030303,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.5303030303030303,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6373056994818653,
"acc_stderr": 0.03469713791704371,
"acc_norm": 0.6373056994818653,
"acc_norm_stderr": 0.03469713791704371
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4282051282051282,
"acc_stderr": 0.025088301454694827,
"acc_norm": 0.4282051282051282,
"acc_norm_stderr": 0.025088301454694827
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.025348097468097845,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.025348097468097845
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4327731092436975,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.4327731092436975,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.618348623853211,
"acc_stderr": 0.020828148517022596,
"acc_norm": 0.618348623853211,
"acc_norm_stderr": 0.020828148517022596
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.03203614084670058,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.03203614084670058
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.03447891136353382,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.03447891136353382
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6244725738396625,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.6244725738396625,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4663677130044843,
"acc_stderr": 0.033481800170603065,
"acc_norm": 0.4663677130044843,
"acc_norm_stderr": 0.033481800170603065
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.043749285605997376,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.043749285605997376
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.04465869780531009,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.04465869780531009
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.03889066619112723,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.03889066619112723
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6965811965811965,
"acc_stderr": 0.03011821010694265,
"acc_norm": 0.6965811965811965,
"acc_norm_stderr": 0.03011821010694265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5351213282247765,
"acc_stderr": 0.017835798806290642,
"acc_norm": 0.5351213282247765,
"acc_norm_stderr": 0.017835798806290642
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.02691864538323901,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.02691864538323901
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.0285803410651383,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.0285803410651383
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5080385852090032,
"acc_stderr": 0.028394421370984538,
"acc_norm": 0.5080385852090032,
"acc_norm_stderr": 0.028394421370984538
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.027777777777777797,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.027777777777777797
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.31560283687943264,
"acc_stderr": 0.027724989449509317,
"acc_norm": 0.31560283687943264,
"acc_norm_stderr": 0.027724989449509317
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37353324641460234,
"acc_stderr": 0.012354994823515267,
"acc_norm": 0.37353324641460234,
"acc_norm_stderr": 0.012354994823515267
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3639705882352941,
"acc_stderr": 0.029227192460032025,
"acc_norm": 0.3639705882352941,
"acc_norm_stderr": 0.029227192460032025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.02010258389588718,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.02010258389588718
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5673469387755102,
"acc_stderr": 0.03171752824062664,
"acc_norm": 0.5673469387755102,
"acc_norm_stderr": 0.03171752824062664
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6218905472636815,
"acc_stderr": 0.03428867848778658,
"acc_norm": 0.6218905472636815,
"acc_norm_stderr": 0.03428867848778658
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03811079669833531,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03811079669833531
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32313341493268055,
"mc1_stderr": 0.0163718362864546,
"mc2": 0.5073211090135596,
"mc2_stderr": 0.015470937650792245
},
"harness|winogrande|5": {
"acc": 0.6535122336227308,
"acc_stderr": 0.013373773411685642
},
"harness|gsm8k|5": {
"acc": 0.24639878695981804,
"acc_stderr": 0.011869498557755346
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
camara421/embdataset | ---
license: mit
---
|
wentingzhao/one-million-instructions | ---
dataset_info:
features:
- name: user
dtype: string
- name: system
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 327249922
num_examples: 2332040
download_size: 172927838
dataset_size: 327249922
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "one-million-instructions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TZ123321/b | ---
license: apache-2.0
---
|
Electrofried/promptmaster-data | ---
dataset_info:
features:
- name: data1
dtype: string
- name: data2
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 6217630
num_examples: 3795
- name: validation
num_bytes: 731990
num_examples: 474
- name: test
num_bytes: 705376
num_examples: 475
download_size: 3287389
dataset_size: 7654996
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
task_categories:
- text2text-generation
language:
- en
tags:
- not-for-all-audiences
- art
--- |
rmihiranga/guanaco-llama2-1k | ---
dataset_info:
features:
- name: Human
dtype: string
- name: Assistant
dtype: string
splits:
- name: train
num_bytes: 2554794
num_examples: 469
download_size: 971796
dataset_size: 2554794
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/Hatefulmemes_test_text_davinci_002_Hatefulmemes_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: raw_prediction
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_10
num_bytes: 366795867.0
num_examples: 1000
- name: fewshot_15
num_bytes: 369012572.0
num_examples: 1000
download_size: 727994919
dataset_size: 735808439.0
---
# Dataset Card for "Hatefulmemes_test_text_davinci_002_Hatefulmemes_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lmms-lab/ScienceQA-IMG | ---
dataset_info:
features:
- name: image
dtype: image
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype: int8
- name: hint
dtype: string
- name: task
dtype: string
- name: grade
dtype: string
- name: subject
dtype: string
- name: topic
dtype: string
- name: category
dtype: string
- name: skill
dtype: string
- name: lecture
dtype: string
- name: solution
dtype: string
splits:
- name: train
num_bytes: 206256098.99371365
num_examples: 6218
- name: validation
num_bytes: 69283708.62697478
num_examples: 2097
- name: test
num_bytes: 65753122.30087244
num_examples: 2017
download_size: 663306124
dataset_size: 341292929.9215609
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
<p align="center" width="100%">
<img src="https://i.postimg.cc/g0QRgMVv/WX20240228-113337-2x.png" width="100%" height="80%">
</p>
# Large-scale Multi-modality Models Evaluation Suite
> Accelerating the development of large-scale multi-modality models (LMMs) with `lmms-eval`
🏠 [Homepage](https://lmms-lab.github.io/) | 📚 [Documentation](docs/README.md) | 🤗 [Huggingface Datasets](https://huggingface.co/lmms-lab)
# This Dataset
This is a formatted and filtered version of [derek-thomas/ScienceQA](https://huggingface.co/datasets/derek-thomas/ScienceQA) with only image instances. It is used in our `lmms-eval` pipeline to allow for one-click evaluations of large multi-modality models.
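For reference, a minimal sketch (dataset id, split, and feature names taken from this card's metadata) of loading the data directly with 🤗 Datasets:
```python
from datasets import load_dataset

# Splits defined in this card: train, validation, test
ds = load_dataset("lmms-lab/ScienceQA-IMG", split="test")
example = ds[0]
print(example["question"], example["choices"], example["answer"])
```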
```
@inproceedings{lu2022learn,
title={Learn to Explain: Multimodal Reasoning via Thought Chains for Science Question Answering},
    author={Lu, Pan and Mishra, Swaroop and Xia, Tony and Qiu, Liang and Chang, Kai-Wei and Zhu, Song-Chun and Tafjord, Oyvind and Clark, Peter and Kalyan, Ashwin},
booktitle={The 36th Conference on Neural Information Processing Systems (NeurIPS)},
year={2022}
}
``` |
arthurmluz/wikilingua_data-xlsum_cstnews_results | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 23663943
num_examples: 8165
download_size: 14055635
dataset_size: 23663943
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "wikilingua_data-xlsumm_cstnews_results"
rouge={'rouge1': 0.22730958909234303, 'rouge2': 0.05480148947185013, 'rougeL': 0.1484336497540636, 'rougeLsum': 0.1484336497540636}
Bert={'precision': 0.6786886892651607, 'recall': 0.7067214733716248, 'f1': 0.6914363930397652}
mover = 0.5873519688127872 |
tyzhu/synpre_extract_q10_a5_1M_q_middle | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: validation
num_bytes: 9241458
num_examples: 9777
- name: train
num_bytes: 925944617
num_examples: 976352
download_size: 545393918
dataset_size: 935186075
---
# Dataset Card for "synpre_extract_q10_a5_1M_q_middle"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1712941655 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 19660
num_examples: 44
download_size: 13542
dataset_size: 19660
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712941655"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
venkat-srinivasan-nexusflow/multiapi_prototype_VT_dec11 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prediction
dtype: string
- name: ground_truth
dtype: string
- name: correctness
dtype: int64
splits:
- name: standard
num_bytes: 55371
num_examples: 151
download_size: 21433
dataset_size: 55371
configs:
- config_name: default
data_files:
- split: standard
path: data/standard-*
---
|
result-kand2-sdxl-wuerst-karlo/78fe0016 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 173
num_examples: 10
download_size: 1317
dataset_size: 173
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "78fe0016"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seanxh/twitter_dataset_1713203384 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 123776
num_examples: 290
download_size: 47072
dataset_size: 123776
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bigscience-data/roots_eu_wikisource | ---
language: eu
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_eu_wikisource
# wikisource_filtered
- Dataset uid: `wikisource_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 2.6306 % of total
- 12.7884 % of fr
- 19.8886 % of indic-bn
- 20.9966 % of indic-ta
- 2.3478 % of ar
- 4.7068 % of indic-hi
- 18.0998 % of indic-te
- 1.7155 % of es
- 19.4800 % of indic-kn
- 9.1737 % of indic-ml
- 17.1771 % of indic-mr
- 17.1870 % of indic-gu
- 70.3687 % of indic-as
- 1.0165 % of pt
- 7.8642 % of indic-pa
- 1.3501 % of vi
- 4.9411 % of indic-or
- 0.5307 % of ca
- 2.3593 % of id
- 1.5928 % of eu
### BigScience processing steps
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: indic-bn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: indic-kn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- remove_wiki_mojibake
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-as
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-or
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
#### Filters applied to: ca
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: id
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
|
as-cle-bert/VirBiCla-training | ---
license: mit
---
# Dataset Card for VirBiCla-training
VirBiCla is an ML-based viral DNA detector designed for long-read sequencing metagenomics.
This dataset supports training of the base ML model.
## Dataset Details
### Dataset Description
- **Curated by:** [Astra Bertelli](https://astrabert.vercel.app/)
- **License:** MIT License
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [GitHub repository for VirBiCla](https://github.com/AstraBert/VirBiCla)
## Uses
This dataset is intended as support for training the base VirBiCla model.
## Dataset Structure
The dataset is a CSV file of 60,003 sequence records (from RefSeq 16S bacterial rRNA, 18S fungal rRNA, SSU eukaryotic rRNA, and RefSeq viral genomes), each evaluated on 13 features.
Features are:
- Domain
- A, T, C and G proportion
- Percentage of A, T, C and G homopolymeric regions
- Gene density
- Entropy
- Effective Number of Codons (a codon usage metric)
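As an illustration only (a sketch in plain Python, not the project's actual feature-extraction code), two of the features above, nucleotide proportions and entropy, could be computed like this:
```python
import math
from collections import Counter

def proportions_and_entropy(seq: str) -> dict:
    """Nucleotide proportions and Shannon entropy (in bits) of a DNA sequence."""
    seq = seq.upper()
    counts = Counter(seq)
    n = len(seq)
    features = {base: counts.get(base, 0) / n for base in "ATCG"}
    # Shannon entropy over the observed base distribution
    features["entropy"] = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return features

print(proportions_and_entropy("ATCGGGCTAATCG"))
```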
## Dataset Creation
Everything needed for dataset creation can be found on the [VirBiCla website](https://astrabert.github.io/VirBiCla).
## Bias, Risks, and Limitations
The dataset is mainly geared towards amplicon sequencing and long-read sequencing, which are the best use cases for VirBiCla.
## Citation
Please consider citing the author of this work (Astra Bertelli) and the VirBiCla [GitHub repository](https://github.com/AstraBert/VirBiCla) when using this dataset or the associated model.
|
awacke1/DNA-Aaron-C-Wacker-Open-Source-Genome-Project | ---
license: mit
---
|
lapp0/hotpot_query_expansion_synthetic_annotated | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: input_entities
sequence: string
- name: output_entities
sequence: string
- name: out_in_ent_score
dtype: float64
- name: in_out_ent_score
dtype: float64
- name: pair_score
dtype: float32
splits:
- name: train
num_bytes: 27024967
num_examples: 85925
- name: eval
num_bytes: 1418300
num_examples: 4522
download_size: 19029050
dataset_size: 28443267
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
---
|
Yusen/Sovits_ATRI | ---
license: other
---
|
d4rk3r/invoices | ---
license: unlicense
---
|
bigscience-data/roots_indic-bn_ted_talks_iwslt | ---
language: bn
license: cc-by-nc-nd-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-bn_ted_talks_iwslt
# WIT Ted Talks
- Dataset uid: `ted_talks_iwslt`
### Description
The Web Inventory Talk is a collection of the original TED talks and their translated versions. The translations are available in 109+ languages, though the distribution is not uniform.
### Homepage
https://github.com/huggingface/datasets/blob/master/datasets/ted_talks_iwslt/README.md
### Licensing
- open license
- cc-by-nc-nd-4.0: Creative Commons Attribution Non Commercial No Derivatives 4.0 International
TED makes its collection of video recordings and transcripts of talks available under the Creative Commons BY-NC-ND license. WIT3 acknowledges the authorship of TED talks (BY condition) and does not redistribute transcripts for commercial purposes (NC). As regards the integrity of the work (ND), WIT3 only changes the format of the container, while preserving the original contents. WIT3 aims to support research on human language processing as well as the diffusion of TED Talks!
### Speaker Locations
- Southern Europe
- Italy
### Sizes
- 0.0305 % of total
- 0.0736 % of ar
- 0.2002 % of pt
- 0.0128 % of zh
- 0.2236 % of vi
- 0.0330 % of fr
- 0.0545 % of es
- 0.0122 % of en
- 0.3704 % of id
- 0.0373 % of indic-hi
- 0.0330 % of indic-ta
- 0.1393 % of indic-mr
- 0.0305 % of ca
- 0.1179 % of indic-ur
- 0.0147 % of indic-bn
- 0.0240 % of indic-ml
- 0.0244 % of indic-te
- 0.0503 % of indic-gu
- 0.0211 % of indic-kn
- 0.0274 % of eu
- 0.0023 % of indic-as
- 0.0001 % of indic-pa
### BigScience processing steps
#### Filters applied to: ar
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: pt
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: zh
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: fr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: es
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: id
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: ca
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ur
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-as
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-pa
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
|
if001/oasst1_ja_ppl | ---
license: apache-2.0
language:
- ja
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: input_ppl
dtype: int64
- name: instruction_ppl
dtype: int64
- name: output_ppl
dtype: int64
- name: full_ppl
dtype: int64
splits:
- name: train
num_bytes: 60856874
num_examples: 55359
download_size: 27216157
dataset_size: 60856874
---
**This is a fork of the following repository:**
https://huggingface.co/datasets/kunishou/oasst1-89k-ja
The data is organized into instruction, input, and output fields, and perplexity scores computed with KenLM are attached.
The tokenizer used for the perplexity computation is available here:
https://huggingface.co/if001/sentencepiece_ja
- instruction_ppl: perplexity of the instruction alone
- output_ppl: perplexity of the output alone
- full_ppl: perplexity of the instruction and output combined into a single instruction-style text
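For example, a minimal sketch (field names from the YAML above; the threshold is purely illustrative) of filtering out high-perplexity rows:
```python
from datasets import load_dataset

ds = load_dataset("if001/oasst1_ja_ppl", split="train")
# Keep rows whose combined perplexity is below an illustrative threshold
low_ppl = ds.filter(lambda ex: ex["full_ppl"] < 500)
print(len(low_ppl))
```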
|
presencesw/contract-nli | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: gold_label
dtype: string
splits:
- name: train
num_bytes: 80733252
num_examples: 7191
- name: test
num_bytes: 23823456
num_examples: 2091
- name: dev
num_bytes: 12720238
num_examples: 1037
download_size: 3153735
dataset_size: 117276946
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: dev
path: data/dev-*
---
|
joey234/mmlu-human_aging-rule-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 96712
num_examples: 223
download_size: 59712
dataset_size: 96712
---
# Dataset Card for "mmlu-human_aging-rule-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UnderstandLing/oasst1_hi_threads | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 14169495
num_examples: 9486
- name: validation
num_bytes: 625488
num_examples: 408
download_size: 4666273
dataset_size: 14794983
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
rokset3/keystrokes136M_normalized_features | ---
dataset_info:
features:
- name: participant_id
dtype: int64
- name: section_id
dtype: int64
- name: keycode_ids
sequence: float64
- name: hl
sequence: float64
- name: il
sequence: float64
- name: pl
sequence: float64
- name: rl
sequence: float64
splits:
- name: train
num_bytes: 2312283700
num_examples: 1168095
- name: test
num_bytes: 2314948560
num_examples: 1168110
download_size: 818494072
dataset_size: 4627232260
---
# Dataset Card for "keystrokes136M_normalized_features"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-Blaise-g__SumPubmed-3c512f6e-12265641 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- Blaise-g/SumPubmed
eval_info:
task: summarization
model: Blaise-g/long_t5_global_large_baseline_pubmed
metrics: ['bertscore']
dataset_name: Blaise-g/SumPubmed
dataset_config: Blaise-g--SumPubmed
dataset_split: test
col_mapping:
text: text
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: Blaise-g/long_t5_global_large_baseline_pubmed
* Dataset: Blaise-g/SumPubmed
* Config: Blaise-g--SumPubmed
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Blaise-g](https://huggingface.co/Blaise-g) for evaluating this model. |
Chunt0/atoul-12-1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 4729106.0
num_examples: 10
download_size: 4716614
dataset_size: 4729106.0
---
# Dataset Card for "atoul-12-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lovodkin93/FuseReviews | ---
license: apache-2.0
---
|
ResplendentAI/Sissification_Hypno_1k | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
tags:
- not-for-all-audiences
pretty_name: Sissification Hypno
size_categories:
- n<1K
---
NSFW sissification hypno dataset featuring content which many users will find disturbing. Use at your own discretion. |
sade-adrien/context_extension-mistral-16k | ---
dataset_info:
features:
- name: raw_content
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 13206603486
num_examples: 30000
download_size: 5395605016
dataset_size: 13206603486
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "context_extension-mistral-16k"
**Note:** the example at `idx=27710` has length 32717. Remove it with
`dataset['train'] = dataset['train'].select(i for i in range(len(dataset['train'])) if i != 27710)`.
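A minimal end-to-end sketch of loading the dataset and dropping that row (assuming the `datasets` library is installed):
```python
from datasets import load_dataset

dataset = load_dataset("sade-adrien/context_extension-mistral-16k")

# Drop the single over-length example (index 27710, length 32717).
keep = [i for i in range(len(dataset["train"])) if i != 27710]
dataset["train"] = dataset["train"].select(keep)
print(len(dataset["train"]))  # 29999
```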
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
creative-graphic-design/Rico | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories: []
source_datasets:
- original
task_categories:
- other
task_ids: []
pretty_name: Rico
tags:
- graphic design
dataset_info:
- config_name: default
features:
- name: ui_number
dtype: int32
- name: app_package_name
dtype: string
- name: interaction_trace_number
dtype: string
- name: ui_number_in_trace
dtype: string
splits:
- name: metadata
num_bytes: 2880097
num_examples: 66261
download_size: 960113
dataset_size: 2880097
- config_name: ui-screenshots-and-view-hierarchies
features:
- name: screenshot
dtype: image
- name: activity_name
dtype: string
- name: activity
struct:
- name: root
struct:
- name: abs_pos
dtype: bool
- name: adapter_view
dtype: bool
- name: ancestors
sequence: string
- name: bounds
sequence: int64
- name: clickable
dtype: bool
- name: content_desc
sequence: string
- name: draw
dtype: bool
- name: enabled
dtype: bool
- name: focusable
dtype: bool
- name: focused
dtype: bool
- name: klass
dtype: string
- name: long_clickable
dtype: bool
- name: package
dtype: string
- name: pressed
dtype: string
- name: pointer
dtype: string
- name: rel_bounds
sequence: int64
- name: resource_id
dtype: string
- name: scrollable_horizontal
dtype: bool
- name: scrollable_vertical
dtype: bool
- name: selected
dtype: bool
- name: visibility
dtype: string
- name: visible_to_user
dtype: bool
- name: children
sequence:
sequence:
- name: abs_pos
dtype: bool
- name: adapter_view
dtype: bool
- name: ancestors
sequence: string
- name: bounds
sequence: int64
- name: clickable
dtype: bool
- name: content_desc
sequence: string
- name: draw
dtype: bool
- name: enabled
dtype: bool
- name: focusable
dtype: bool
- name: focused
dtype: bool
- name: klass
dtype: string
- name: long_clickable
dtype: bool
- name: package
dtype: string
- name: pressed
dtype: string
- name: pointer
dtype: string
- name: rel_bounds
sequence: int64
- name: resource_id
dtype: string
- name: scrollable_horizontal
dtype: bool
- name: scrollable_vertical
dtype: bool
- name: selected
dtype: bool
- name: visibility
dtype: string
- name: visible_to_user
dtype: bool
- name: added_fragments
sequence: string
- name: active_fragments
sequence: string
- name: is_keyboard_deployed
dtype: bool
- name: request_id
dtype: string
splits:
- name: train
num_bytes: 7235922008.75
num_examples: 56322
- name: validation
num_bytes: 425096153.75
num_examples: 3314
- name: test
num_bytes: 846527051.875
num_examples: 6625
download_size: 6478456942
dataset_size: 8507545214.375
configs:
- config_name: default
data_files:
- split: metadata
path: data/metadata-*
- config_name: ui-screenshots-and-view-hierarchies
data_files:
- split: train
path: ui-screenshots-and-view-hierarchies/train-*
- split: validation
path: ui-screenshots-and-view-hierarchies/validation-*
- split: test
path: ui-screenshots-and-view-hierarchies/test-*
---
# Dataset Card for Rico
[](https://github.com/shunk031/huggingface-datasets_Rico/actions/workflows/ci.yaml)
## Table of Contents
- [Dataset Card for Rico](#dataset-card-for-rico)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://www.interactionmining.org/rico.html
- **Repository:** https://github.com/shunk031/huggingface-datasets_Rico
- **Paper (UIST2017):** https://dl.acm.org/doi/10.1145/3126594.3126651
### Dataset Summary
Rico: A Mobile App Dataset for Building Data-Driven Design Applications
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
- UI screenshots and view hierarchies
```python
import datasets as ds
dataset = ds.load_dataset(
path="shunk031/Rico",
name="ui-screenshots-and-view-hierarchies",
)
```
- UI metadata
```python
import datasets as ds
dataset = ds.load_dataset(
path="shunk031/Rico",
name="ui-metadata",
)
```
- UI layout vectors
```python
import datasets as ds
dataset = ds.load_dataset(
path="shunk031/Rico",
name="ui-layout-vectors",
)
```
- Interaction traces
```python
import datasets as ds
dataset = ds.load_dataset(
path="shunk031/Rico",
name="interaction-traces",
)
```
- [WIP] Animations
```python
import datasets as ds
dataset = ds.load_dataset(
path="shunk031/Rico",
name="animations",
)
```
- Play store metadata
```python
import datasets as ds
dataset = ds.load_dataset(
path="shunk031/Rico",
name="play-store-metadata",
)
```
- UI screenshots and hierarchies with semantic annotations
```python
import datasets as ds
dataset = ds.load_dataset(
path="shunk031/Rico",
name="ui-screenshots-and-hierarchies-with-semantic-annotations",
)
```
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```bibtex
@inproceedings{deka2017rico,
title={Rico: A mobile app dataset for building data-driven design applications},
author={Deka, Biplab and Huang, Zifeng and Franzen, Chad and Hibschman, Joshua and Afergan, Daniel and Li, Yang and Nichols, Jeffrey and Kumar, Ranjitha},
booktitle={Proceedings of the 30th annual ACM symposium on user interface software and technology},
pages={845--854},
year={2017}
}
```
### Contributions
Thanks to [DATA DRIVEN DESIGN GROUP UNIVERSITY OF ILLINOIS AT URBANA-CHAMPAIGN](http://ranjithakumar.net/) for creating this dataset.
|
sadrasabouri/ShahNegar | ---
annotations_creators:
- machine-generated
language_creators:
- expert-generated
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- image-to-text
- text-to-image
task_ids:
- image-captioning
pretty_name: ShahNegar
---
# ShahNegar (A Plotted version of The Shahnameh)
This dataset is a plotted version of Ferdowsi's Shahnameh (a highly regarded ancient collection of Farsi poems) generated using DALL-E mini (aka [craiyon](https://www.craiyon.com/)). You can load this dataset with the code below:
```python
from datasets import load_dataset
dataset = load_dataset("sadrasabouri/ShahNegar")
```
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Paper:**
- **Point of Contact:** [Sadra Sabouri](mailto:sabouri.sadra@gmail.com)
### Dataset Summary
This dataset contains more than 30K images with their corresponding text from the Shahnameh. For each Shahnameh paragraph, we generated at most 9 images; images generated from the same paragraph share the same `id` value. A human annotation post-process removed some harmful/private generated images from the dataset. In the end, we obtained more than 30K images at 256 × 256 resolution.
### Supported Tasks and Leaderboards
The main reason for open-sourcing this dataset is its artistic value, but it can also be used for the tasks below:
+ text-to-image
+ image-to-text (image captioning)
### Languages
The Shahnameh was originally written in Farsi (Persian), but the translated version we used for this dataset - [satoor](https://www.sattor.com/english/Shahnameh.pdf) - is entirely in English, with no alignment to the corresponding Farsi poems. We plan to add a field with the corresponding Farsi poem to each dataset entry as soon as possible.
## Dataset Structure
### Data Fields
Here is an instance of our dataset:
```json
{
"image": <PIL Image Bytes>,
"id": 0,
"text": "He took up his abode in the mountains, and clad himself and his people in tiger-skins, and from him sprang all kindly nurture and the arts of clothing, till then unknown."
}
```
+ `image`: the image for given text.
+ `id`: the id for the text (**Not for the image**).
+ `text`: the English text for the image.
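For quick inspection, here is a minimal access sketch (assuming the `datasets` library and PIL):
```python
from datasets import load_dataset

dataset = load_dataset("sadrasabouri/ShahNegar", split="train")

sample = dataset[0]
print(sample["id"], sample["text"])
sample["image"].show()  # opens the PIL image in the default viewer
```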
### Data Splits
This dataset has only one split (`train`).
## Dataset Creation
The translated version of the Shahnameh was derived from the [satoor](https://www.sattor.com/english/Shahnameh.pdf) website. We first extracted the text from the PDF, then divided paragraphs into sentences and fed each sentence to the DALL-E mini model through its online API, which generated nine images per sentence. After annotation, we ended up with more than 30000 images.
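As an illustration, here is a hypothetical sketch of the sentence-splitting step described above (the exact splitting rule used in the original pipeline is our assumption):
```python
import re

def paragraph_to_sentences(paragraph: str) -> list[str]:
    # Naive split on sentence-ending punctuation followed by whitespace
    # (hypothetical; the original pipeline may have used a different rule).
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", paragraph) if s.strip()]

print(paragraph_to_sentences(
    "He took up his abode in the mountains. He clad himself in tiger-skins."
))
```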
### Annotations
#### Annotation process
During image generation, we noticed a bias in the DALL-E mini model toward the word `iran`: sentences containing this word tended to yield pictures of Iranian political figures that were usually completely irrelevant. The annotation process mainly focused on dealing with these pictures: we removed images that seemed harmful to those figures and/or irrelevant to the context.
#### Who are the annotators?
Mahsa Namdar and Sadra Sabouri were the annotators of this dataset.
### Personal and Sensitive Information
Since the textual data is freely downloadable and the images were generated by an image generation model, there should not be any personal information in this dataset. If you nevertheless find anything harmful or in violation of someone's privacy, please let us know; we will take proper action as soon as possible.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is released mainly for its artistic value. Generating images for the Shahnameh - one of the most important books of Farsi poetry - is our main contribution. Beyond that, the dataset can also be used for image-to-text and text-to-image tasks.
### Discussion of Biases
Possible biases in this dataset come from the biases of DALL-E mini itself; in fact, inspecting the dataset entries is a good way to surface biases in that model. One worth mentioning is DALL-E mini's bias for the word `iran`, which nearly always produces images of the country's political figures.
### Other Known Limitations
There are ongoing debates in the literature about the limitations of machine-generated datasets. Some argue that, since today's models are imperfect - and so are their outputs - such artificially generated datasets should not be used to train new models, because doing so caps the new model's accuracy at that of the model that produced the primary dataset.
## Additional Information
### Dataset Curators
+ Emad Fatemizadeh: The general idea of generating a graphical version of Farsi poems was first proposed by him.
+ Sadra Sabouri: He found a translated version of the Shahnameh, extracted and tokenized poems from it, and used the online DALL-E mini API to generate images from the poems.
+ Mahsa Namdar: She carried out the post-process annotation of the data.
### Licensing Information
MIT
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@sadrasabouri](https://github.com/sadrasabouri) for adding this dataset.
|
thiennguyen1998/my_dataset | ---
license: mit
language:
- en
pretty_name: Company Information Dataset
size_categories:
- n<1K
--- |
ZiAngGu/omni3d_v3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
- name: label
sequence: string
- name: box2d_pro
sequence:
sequence:
sequence: int64
splits:
- name: train
num_bytes: 23927821021.36
num_examples: 220705
download_size: 25822293126
dataset_size: 23927821021.36
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "omni3d_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PocketDoc/Floyd-Text-Adventures | ---
tags:
- not-for-all-audiences
task_categories:
- conversational
language:
- en
pretty_name: Floyd Text Adventures
---
This is the 'Floyd' text adventure dataset converted to a chat format with system messages. The system messages were randomly constructed from a table of phrases and templates. The original data can be found in the .7z archive.
**Credits:**
Thank you to VE Forbryderne from KoboldAI for scraping the dataset. |
mariiacarliinha/Chicoin | ---
license: openrail
---
|
CyberHarem/kotone_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kotone/コトネ (Pokémon)
This is the dataset of kotone/コトネ (Pokémon), containing 500 images and their tags.
The core tags of this character are `brown_hair, twintails, hat, brown_eyes, bow, cabbie_hat, long_hair, white_headwear, hat_bow, red_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 386.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kotone_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 265.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kotone_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 953 | 494.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kotone_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 360.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kotone_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 953 | 641.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kotone_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
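For the packaged IMG+TXT variants in the table above, here is a minimal download-and-extract sketch (assuming `huggingface_hub`; `dataset-800.zip` is one of the filenames listed above):
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download one of the packaged variants (here: the 800px package)
zip_file = hf_hub_download(
    repo_id='CyberHarem/kotone_pokemon',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset-800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```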
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kotone_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, team_rocket_uniform, black_headwear, looking_at_viewer, thigh_boots, thighhighs, pokemon_(creature), white_gloves, open_mouth, blush, grey_gloves, hand_up, holding_poke_ball, jacket, logo, poke_ball_(basic), cosplay, eyelashes, skirt, smile, white_belt |
| 1 | 9 |  |  |  |  |  | 1girl, closed_mouth, looking_at_viewer, red_shirt, smile, eyelashes, blush, collarbone, solo, upper_body, blue_overalls, simple_background, white_background |
| 2 | 6 |  |  |  |  |  | 1girl, :d, blue_overalls, open_mouth, red_shirt, red_footwear, shoes, teeth, thighhighs, tongue, yellow_bag, riding_pokemon |
| 3 | 5 |  |  |  |  |  | 1girl, holding_poke_ball, overalls, poke_ball_(basic), red_shirt, simple_background, solo, white_background, closed_mouth, looking_at_viewer, smile, bag, blush |
| 4 | 5 |  |  |  |  |  | 1girl, blue_overalls, closed_mouth, red_footwear, red_shirt, shoes, eyelashes, holding_pokemon, sitting, smile, white_thighhighs, blush, grass, white_background |
| 5 | 8 |  |  |  |  |  | 1girl, :d, open_mouth, red_shirt, tongue, eyelashes, holding_pokemon, blue_overalls, blush_stickers, green_background, notice_lines, twitter_username |
| 6 | 19 |  |  |  |  |  | 1girl, hat_ribbon, overalls, red_ribbon, short_twintails, solo, smile, white_thighhighs |
| 7 | 7 |  |  |  |  |  | 1girl, hat_ribbon, open_mouth, overalls, red_ribbon, smile, blush, pokemon_(creature), thighhighs, solo |
| 8 | 15 |  |  |  |  |  | 1girl, official_alternate_costume, detached_sleeves, blue_kimono, pokemon_(creature), eyelashes, floral_print, holding, looking_at_viewer, open_mouth, :d, hand_up, obi, yukata |
| 9 | 15 |  |  |  |  |  | 1girl, hetero, penis, 1boy, hat_ribbon, red_ribbon, blush, solo_focus, cum, nipples, pussy, medium_breasts, sex, thighhighs, one_eye_closed, open_mouth, spread_legs, pokemon_(creature), short_twintails, uncensored, vaginal |
| 10 | 12 |  |  |  |  |  | hair_flower, 1girl, blue_flower, detached_sleeves, official_alternate_costume, :d, bangs, bare_shoulders, green_dress, open_mouth, solo, blush, breasts, eyelashes, pokemon_(creature), sandals, brown_footwear, yellow_hairband |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | team_rocket_uniform | black_headwear | looking_at_viewer | thigh_boots | thighhighs | pokemon_(creature) | white_gloves | open_mouth | blush | grey_gloves | hand_up | holding_poke_ball | jacket | logo | poke_ball_(basic) | cosplay | eyelashes | skirt | smile | white_belt | closed_mouth | red_shirt | collarbone | solo | upper_body | blue_overalls | simple_background | white_background | :d | red_footwear | shoes | teeth | tongue | yellow_bag | riding_pokemon | overalls | bag | holding_pokemon | sitting | white_thighhighs | grass | blush_stickers | green_background | notice_lines | twitter_username | hat_ribbon | red_ribbon | short_twintails | official_alternate_costume | detached_sleeves | blue_kimono | floral_print | holding | obi | yukata | hetero | penis | 1boy | solo_focus | cum | nipples | pussy | medium_breasts | sex | one_eye_closed | spread_legs | uncensored | vaginal | hair_flower | blue_flower | bangs | bare_shoulders | green_dress | breasts | sandals | brown_footwear | yellow_hairband |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:----------------------|:-----------------|:--------------------|:--------------|:-------------|:---------------------|:---------------|:-------------|:--------|:--------------|:----------|:--------------------|:---------|:-------|:--------------------|:----------|:------------|:--------|:--------|:-------------|:---------------|:------------|:-------------|:-------|:-------------|:----------------|:--------------------|:-------------------|:-----|:---------------|:--------|:--------|:---------|:-------------|:-----------------|:-----------|:------|:------------------|:----------|:-------------------|:--------|:-----------------|:-------------------|:---------------|:-------------------|:-------------|:-------------|:------------------|:-----------------------------|:-------------------|:--------------|:---------------|:----------|:------|:---------|:---------|:--------|:-------|:-------------|:------|:----------|:--------|:-----------------|:------|:-----------------|:--------------|:-------------|:----------|:--------------|:--------------|:--------|:-----------------|:--------------|:----------|:----------|:-----------------|:------------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | | | X | | | | | | X | | | | | | | | X | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | | | | X | | | X | | | | | | | | | | | | | | X | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | | | | | | X | | | X | | | X | | | | X | | X | X | | X | | | X | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | | | | | | X | | | | | | | | X | | X | | X | X | | | | X | | X | | X | X | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | | | | | | | X | | | | | | | | | X | | | | | X | | | | X | | | X | | | | X | | | | | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 19 |  |  |  |  |  | X | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | X | | | | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | | | | | X | X | | X | X | | | | | | | | | | X | | | | | X | | | | | | | | | | | | X | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 15 |  |  |  |  |  | X | | | X | | | X | | X | | | X | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 15 |  |  |  |  |  | X | | | | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 10 | 12 |  |  |  |  |  | X | | | | | | X | | X | X | | | | | | | | X | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_pszemraj__pythia-31m-KI_v1-2048-scratch | ---
pretty_name: Evaluation run of pszemraj/pythia-31m-KI_v1-2048-scratch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pszemraj/pythia-31m-KI_v1-2048-scratch](https://huggingface.co/pszemraj/pythia-31m-KI_v1-2048-scratch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pszemraj__pythia-31m-KI_v1-2048-scratch\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T08:48:56.484110](https://huggingface.co/datasets/open-llm-leaderboard/details_pszemraj__pythia-31m-KI_v1-2048-scratch/blob/main/results_2023-10-28T08-48-56.484110.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.009437919463087249,\n\
\ \"em_stderr\": 0.0009901902239103783,\n \"f1\": 0.015236996644295321,\n\
\ \"f1_stderr\": 0.0010823937767906837,\n \"acc\": 0.25887924230465664,\n\
\ \"acc_stderr\": 0.007021809798087482\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.009437919463087249,\n \"em_stderr\": 0.0009901902239103783,\n\
\ \"f1\": 0.015236996644295321,\n \"f1_stderr\": 0.0010823937767906837\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5177584846093133,\n\
\ \"acc_stderr\": 0.014043619596174964\n }\n}\n```"
repo_url: https://huggingface.co/pszemraj/pythia-31m-KI_v1-2048-scratch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|arc:challenge|25_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T08_48_56.484110
path:
- '**/details_harness|drop|3_2023-10-28T08-48-56.484110.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T08-48-56.484110.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T08_48_56.484110
path:
- '**/details_harness|gsm8k|5_2023-10-28T08-48-56.484110.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T08-48-56.484110.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hellaswag|10_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T05-01-19.324903.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T05-01-19.324903.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T05-01-19.324903.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T08_48_56.484110
path:
- '**/details_harness|winogrande|5_2023-10-28T08-48-56.484110.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T08-48-56.484110.parquet'
- config_name: results
data_files:
- split: 2023_09_15T05_01_19.324903
path:
- results_2023-09-15T05-01-19.324903.parquet
- split: 2023_10_28T08_48_56.484110
path:
- results_2023-10-28T08-48-56.484110.parquet
- split: latest
path:
- results_2023-10-28T08-48-56.484110.parquet
---
# Dataset Card for Evaluation run of pszemraj/pythia-31m-KI_v1-2048-scratch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/pszemraj/pythia-31m-KI_v1-2048-scratch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [pszemraj/pythia-31m-KI_v1-2048-scratch](https://huggingface.co/pszemraj/pythia-31m-KI_v1-2048-scratch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pszemraj__pythia-31m-KI_v1-2048-scratch",
"harness_winogrande_5",
split="train")
```
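The same pattern works for the aggregated "results" configuration, and a timestamped split can be requested instead of "latest" to pin a specific run. A minimal sketch (config and split names taken from the `configs` section above):
```python
from datasets import load_dataset

# Load the aggregated results (the "results" config) at their latest state.
results = load_dataset(
    "open-llm-leaderboard/details_pszemraj__pythia-31m-KI_v1-2048-scratch",
    "results",
    split="latest",
)

# A specific run can be pinned through its timestamped split instead.
run = load_dataset(
    "open-llm-leaderboard/details_pszemraj__pythia-31m-KI_v1-2048-scratch",
    "harness_winogrande_5",
    split="2023_10_28T08_48_56.484110",
)
```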
## Latest results
These are the [latest results from run 2023-10-28T08:48:56.484110](https://huggingface.co/datasets/open-llm-leaderboard/details_pszemraj__pythia-31m-KI_v1-2048-scratch/blob/main/results_2023-10-28T08-48-56.484110.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.009437919463087249,
"em_stderr": 0.0009901902239103783,
"f1": 0.015236996644295321,
"f1_stderr": 0.0010823937767906837,
"acc": 0.25887924230465664,
"acc_stderr": 0.007021809798087482
},
"harness|drop|3": {
"em": 0.009437919463087249,
"em_stderr": 0.0009901902239103783,
"f1": 0.015236996644295321,
"f1_stderr": 0.0010823937767906837
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5177584846093133,
"acc_stderr": 0.014043619596174964
}
}
```
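The raw JSON behind these numbers can also be fetched directly from the dataset repo with `huggingface_hub`. A minimal sketch (the filename comes from the link above; note the file may nest the metrics shown here under additional top-level keys, so inspect the structure rather than assuming it):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file for the latest run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_pszemraj__pythia-31m-KI_v1-2048-scratch",
    filename="results_2023-10-28T08-48-56.484110.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Inspect the top-level layout before drilling down to the metrics.
print(list(results.keys()))
```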
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_cola_come_future | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 4025
num_examples: 40
- name: test
num_bytes: 4005
num_examples: 40
- name: train
num_bytes: 30454
num_examples: 339
download_size: 21141
dataset_size: 38484
---
# Dataset Card for "MULTI_VALUE_cola_come_future"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
philschmid/guanaco-oai-style | ---
license: apache-2.0
---
|
chiayewken/commonsense-qa-2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: confidence
dtype: float64
- name: date
dtype: string
- name: relational_prompt
dtype: string
- name: topic_prompt
dtype: string
- name: relational_prompt_used
dtype: bool
- name: topic_prompt_used
dtype: bool
- name: validations
sequence: string
splits:
- name: train
num_bytes: 1541070
num_examples: 9264
- name: validation
num_bytes: 430506
num_examples: 2541
download_size: 1081931
dataset_size: 1971576
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
royallab/PIPPA-cleaned | ---
license: apache-2.0
tags:
- not-for-all-audiences
- conversational
- roleplay
- custom-format
pretty_name: PIPPA - Personal Interaction Pairs Between People and AI
viewer: false
---
A cleaned and fixed version of PIPPA (https://huggingface.co/datasets/PygmalionAI/PIPPA), with the formatting and stray-character issues removed.
It can be used as a calibration dataset for exllamav2, as was done for goliath-rpcal (https://huggingface.co/Panchovix/goliath-120b-exl2-rpcal).
All credits to the Pygmalion team and Undi. |
lhallee/triplets | ---
dataset_info:
features:
- name: positives
dtype: string
- name: anchors
dtype: string
- name: negatives
dtype: string
- name: aspects
dtype: int64
splits:
- name: valid
num_bytes: 8811058.316717582
num_examples: 5000
- name: test
num_bytes: 87523766.68328242
num_examples: 49667
- name: train
num_bytes: 1184298382
num_examples: 807412
download_size: 1424567824
dataset_size: 1280633207.0
configs:
- config_name: default
data_files:
- split: valid
path: data/valid-*
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
joey234/mmlu-high_school_macroeconomics-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 5403
num_examples: 5
- name: test
num_bytes: 994039
num_examples: 390
download_size: 12073
dataset_size: 999442
---
# Dataset Card for "mmlu-high_school_macroeconomics-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
orisuchy/Descriptive_Sentences_He | ---
license: afl-3.0
---
|
joey234/mmlu-security_studies-neg-prepend-verbal | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 19892
num_examples: 5
- name: test
num_bytes: 7752355
num_examples: 245
download_size: 430057
dataset_size: 7772247
---
# Dataset Card for "mmlu-security_studies-neg-prepend-verbal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cartesinus/iva_mt_wslot-exp | ---
dataset_info:
features:
- name: id
dtype: string
- name: locale
dtype: string
- name: origin
dtype: string
- name: partition
dtype: string
- name: translation_utt
dtype:
translation:
languages:
- en
- pl
- name: translation_xml
dtype:
translation:
languages:
- en
- pl
- name: src_bio
dtype: string
- name: tgt_bio
dtype: string
task_categories:
- translation
language:
- en
- pl
- de
- es
- sv
tags:
- machine translation
- nlu
- natural-language-understanding
- virtual assistant
pretty_name: Machine translation for NLU with slot transfer
size_categories:
- 10K<n<100K
license: cc-by-4.0
---
# Machine translation dataset for NLU (Virtual Assistant) with slot transfer between languages
## Dataset Summary
Disclaimer: This is for research purposes only. Please have a look at the license section below. Some of the datasets used to construct IVA_MT have an unknown license.
IVA_MT is a machine translation dataset that can be used to train, adapt and evaluate MT models used in a Virtual Assistant NLU context (e.g. to translate the training corpus of an NLU system).
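For example, loading the corpus with `datasets` and reading one English–Polish pair might look like this. A minimal sketch, assuming a `train` split exists and that no language-pair configuration name is required (depending on how the repository is set up, one such as `"en-pl"` may need to be passed to `load_dataset`):
```python
from datasets import load_dataset

# Load the corpus; a config name such as "en-pl" may be required
# depending on how the repository is set up.
ds = load_dataset("cartesinus/iva_mt_wslot-exp", split="train")

example = ds[0]
print(example["translation_utt"]["en"])  # source utterance
print(example["translation_utt"]["pl"])  # target utterance

# BIO-style slot annotations (per the src_bio / tgt_bio feature names).
print(example["src_bio"], example["tgt_bio"])
```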
## Dataset Composition
### en-pl
| Corpus | Train | Dev | Test |
|----------------------------------------------------------------------|--------|-------|-------|
| [Massive 1.1](https://huggingface.co/datasets/AmazonScience/massive) | 11514 | 2033 | 2974 |
| [Leyzer 0.2.0](https://github.com/cartesinus/leyzer/tree/0.2.0) | 3974 | 701 | 1380 |
| [OpenSubtitles from OPUS](https://opus.nlpl.eu/OpenSubtitles-v1.php) | 2329 | 411 | 500 |
| [KDE from OPUS](https://opus.nlpl.eu/KDE4.php) | 1154 | 241 | 241 |
| [CCMatrix from Opus](https://opus.nlpl.eu/CCMatrix.php) | 1096 | 232 | 237 |
| [Ubuntu from OPUS](https://opus.nlpl.eu/Ubuntu.php) | 281 | 60 | 59 |
| [Gnome from OPUS](https://opus.nlpl.eu/GNOME.php) | 14 | 3 | 3 |
| *total* | 20362 | 3681 | 5394 |
### en-de
| Corpus | Train | Dev | Test |
|----------------------------------------------------------------------|--------|-------|-------|
| [Massive 1.1](https://huggingface.co/datasets/AmazonScience/massive) | 7536 | 1346 | 1955 |
### en-es
| Corpus | Train | Dev | Test |
|----------------------------------------------------------------------|--------|-------|-------|
| [Massive 1.1](https://huggingface.co/datasets/AmazonScience/massive) | 8415 | 1526 | 2202 |
### en-sv
| Corpus | Train | Dev | Test |
|----------------------------------------------------------------------|--------|-------|-------|
| [Massive 1.1](https://huggingface.co/datasets/AmazonScience/massive) | 7540 | 1360 | 1921 |
## Tools
Scripts used to generate this dataset can be found on [github](https://github.com/cartesinus/iva_mt).
## License
This is a composition of 7 datasets, and the licenses are as defined in the original releases:
- MASSIVE: [CC-BY 4.0](https://huggingface.co/datasets/AmazonScience/massive/blob/main/LICENSE)
- Leyzer: [CC BY-NC 4.0](https://github.com/cartesinus/leyzer/blob/master/LICENSE)
- OpenSubtitles: unknown
- KDE: [GNU Public License](https://l10n.kde.org/about.php)
- CCMatrix: no license given, therefore assuming it is LASER project license [BSD](https://github.com/facebookresearch/LASER/blob/main/LICENSE)
- Ubuntu: [GNU Public License](https://help.launchpad.net/Legal)
- Gnome: unknown
|
juancopi81/diana_uribe | ---
task_categories:
- automatic-speech-recognition
dataset_info:
features:
- name: CHANNEL_NAME
dtype: string
- name: URL
dtype: string
- name: TITLE
dtype: string
- name: DESCRIPTION
dtype: string
- name: TRANSCRIPTION
dtype: string
- name: SEGMENTS
dtype: string
splits:
- name: train
num_bytes: 23288573
num_examples: 370
download_size: 11339946
dataset_size: 23288573
tags:
- whisper
- whispering
- base
---
# Dataset Card for "diana_uribe"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Gille__StrangeMerges_36-7B-slerp | ---
pretty_name: Evaluation run of Gille/StrangeMerges_36-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gille/StrangeMerges_36-7B-slerp](https://huggingface.co/Gille/StrangeMerges_36-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_36-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T13:37:17.210533](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_36-7B-slerp/blob/main/results_2024-03-21T13-37-17.210533.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.653430155497492,\n\
\ \"acc_stderr\": 0.03207425213382696,\n \"acc_norm\": 0.6530665733578745,\n\
\ \"acc_norm_stderr\": 0.032741792156283166,\n \"mc1\": 0.6181150550795593,\n\
\ \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.7704842612550872,\n\
\ \"mc2_stderr\": 0.013880019850266273\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.697098976109215,\n \"acc_stderr\": 0.013428241573185349,\n\
\ \"acc_norm\": 0.7261092150170648,\n \"acc_norm_stderr\": 0.013032004972989506\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7172873929496116,\n\
\ \"acc_stderr\": 0.0044939755273867375,\n \"acc_norm\": 0.8882692690699064,\n\
\ \"acc_norm_stderr\": 0.003143910361779262\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"\
acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971114,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971114\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524565,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524565\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\"\
: 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n\
\ \"acc_stderr\": 0.015703498348461763,\n \"acc_norm\": 0.8403669724770643,\n\
\ \"acc_norm_stderr\": 0.015703498348461763\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n\
\ \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.02531049537694486,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.02531049537694486\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4569832402234637,\n\
\ \"acc_stderr\": 0.01666049858050917,\n \"acc_norm\": 0.4569832402234637,\n\
\ \"acc_norm_stderr\": 0.01666049858050917\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537368,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537368\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n\
\ \"acc_stderr\": 0.01275197796767601,\n \"acc_norm\": 0.47327249022164275,\n\
\ \"acc_norm_stderr\": 0.01275197796767601\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6181150550795593,\n\
\ \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.7704842612550872,\n\
\ \"mc2_stderr\": 0.013880019850266273\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598484\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6853677028051555,\n \
\ \"acc_stderr\": 0.012791037227336034\n }\n}\n```"
repo_url: https://huggingface.co/Gille/StrangeMerges_36-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|arc:challenge|25_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|gsm8k|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hellaswag|10_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T13-37-17.210533.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T13-37-17.210533.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- '**/details_harness|winogrande|5_2024-03-21T13-37-17.210533.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T13-37-17.210533.parquet'
- config_name: results
data_files:
- split: 2024_03_21T13_37_17.210533
path:
- results_2024-03-21T13-37-17.210533.parquet
- split: latest
path:
- results_2024-03-21T13-37-17.210533.parquet
---
# Dataset Card for Evaluation run of Gille/StrangeMerges_36-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_36-7B-slerp](https://huggingface.co/Gille/StrangeMerges_36-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_36-7B-slerp",
"harness_winogrande_5",
split="train")
```
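For instance, the aggregated scores can be read from the "results" configuration in the same way; this is a minimal sketch using only the config and split names defined in the YAML metadata above:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Gille__StrangeMerges_36-7B-slerp"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), "configurations available")

# "latest" always points to the most recent evaluation run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```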
## Latest results
These are the [latest results from run 2024-03-21T13:37:17.210533](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_36-7B-slerp/blob/main/results_2024-03-21T13-37-17.210533.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.653430155497492,
"acc_stderr": 0.03207425213382696,
"acc_norm": 0.6530665733578745,
"acc_norm_stderr": 0.032741792156283166,
"mc1": 0.6181150550795593,
"mc1_stderr": 0.017008101939163498,
"mc2": 0.7704842612550872,
"mc2_stderr": 0.013880019850266273
},
"harness|arc:challenge|25": {
"acc": 0.697098976109215,
"acc_stderr": 0.013428241573185349,
"acc_norm": 0.7261092150170648,
"acc_norm_stderr": 0.013032004972989506
},
"harness|hellaswag|10": {
"acc": 0.7172873929496116,
"acc_stderr": 0.0044939755273867375,
"acc_norm": 0.8882692690699064,
"acc_norm_stderr": 0.003143910361779262
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055273,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055273
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971114,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971114
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524565,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524565
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.0399552400768168,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.0399552400768168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.02531049537694486,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.02531049537694486
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4569832402234637,
"acc_stderr": 0.01666049858050917,
"acc_norm": 0.4569832402234637,
"acc_norm_stderr": 0.01666049858050917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537368,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.01275197796767601,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.01275197796767601
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6181150550795593,
"mc1_stderr": 0.017008101939163498,
"mc2": 0.7704842612550872,
"mc2_stderr": 0.013880019850266273
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598484
},
"harness|gsm8k|5": {
"acc": 0.6853677028051555,
"acc_stderr": 0.012791037227336034
}
}
```
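The raw JSON file behind these numbers can also be fetched directly from the dataset repository; this is a sketch assuming the standard `huggingface_hub` download API, with the filename taken from the link above:
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Gille__StrangeMerges_36-7B-slerp",
    filename="results_2024-03-21T13-37-17.210533.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# The overall averages shown above sit under the "all" key
# (in the raw file they may be nested under a top-level "results" key).
all_scores = data.get("results", data)["all"]
print(f"acc={all_scores['acc']:.4f}, acc_norm={all_scores['acc_norm']:.4f}")
```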
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_LeroyDyer__Mixtral_Uncensored | ---
pretty_name: Evaluation run of LeroyDyer/Mixtral_Uncensored
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LeroyDyer/Mixtral_Uncensored](https://huggingface.co/LeroyDyer/Mixtral_Uncensored)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeroyDyer__Mixtral_Uncensored\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T00:24:03.824734](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_Uncensored/blob/main/results_2024-03-22T00-24-03.824734.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6194310965439249,\n\
\ \"acc_stderr\": 0.03297706356835281,\n \"acc_norm\": 0.6236669140168419,\n\
\ \"acc_norm_stderr\": 0.033646817819223775,\n \"mc1\": 0.4810281517747858,\n\
\ \"mc1_stderr\": 0.017490896405762346,\n \"mc2\": 0.6586281643934825,\n\
\ \"mc2_stderr\": 0.015314973601082471\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5998293515358362,\n \"acc_stderr\": 0.014317197787809169,\n\
\ \"acc_norm\": 0.6382252559726962,\n \"acc_norm_stderr\": 0.014041957945038085\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.662617008564031,\n\
\ \"acc_stderr\": 0.0047185047710837655,\n \"acc_norm\": 0.8406691894045011,\n\
\ \"acc_norm_stderr\": 0.003652363253289593\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630643,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630643\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130956,\n\
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130956\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.01709057380421791,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.01709057380421791\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7931034482758621,\n\
\ \"acc_stderr\": 0.014485656041669185,\n \"acc_norm\": 0.7931034482758621,\n\
\ \"acc_norm_stderr\": 0.014485656041669185\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688225,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688225\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n\
\ \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n\
\ \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046633,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046633\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937617,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937617\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409828,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409828\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45045632333767927,\n\
\ \"acc_stderr\": 0.012707390438502346,\n \"acc_norm\": 0.45045632333767927,\n\
\ \"acc_norm_stderr\": 0.012707390438502346\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6241830065359477,\n \"acc_stderr\": 0.019594021136577443,\n \
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.019594021136577443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233257,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233257\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4810281517747858,\n\
\ \"mc1_stderr\": 0.017490896405762346,\n \"mc2\": 0.6586281643934825,\n\
\ \"mc2_stderr\": 0.015314973601082471\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722759\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4184988627748294,\n \
\ \"acc_stderr\": 0.013588287284030876\n }\n}\n```"
repo_url: https://huggingface.co/LeroyDyer/Mixtral_Uncensored
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|arc:challenge|25_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|gsm8k|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hellaswag|10_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-24-03.824734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T00-24-03.824734.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- '**/details_harness|winogrande|5_2024-03-22T00-24-03.824734.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T00-24-03.824734.parquet'
- config_name: results
data_files:
- split: 2024_03_22T00_24_03.824734
path:
- results_2024-03-22T00-24-03.824734.parquet
- split: latest
path:
- results_2024-03-22T00-24-03.824734.parquet
---
# Dataset Card for Evaluation run of LeroyDyer/Mixtral_Uncensored
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LeroyDyer/Mixtral_Uncensored](https://huggingface.co/LeroyDyer/Mixtral_Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
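For a quick overview of what is available, you can first list the per-task configurations; a minimal sketch (the repository name is the one used throughout this card):
```python
from datasets import get_dataset_config_names

# List every configuration (one per evaluated task, plus "results")
# exposed by this evaluation repository.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_LeroyDyer__Mixtral_Uncensored"
)
print(len(configs), configs[:5])
```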
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LeroyDyer__Mixtral_Uncensored",
"harness_winogrande_5",
split="train")
```
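The aggregated metrics live in the separate "results" configuration described above; a minimal sketch for loading them, assuming the "latest" split convention shown in the configs section:
```python
from datasets import load_dataset

# The "latest" split of the "results" configuration always points to the
# most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_LeroyDyer__Mixtral_Uncensored",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics of the run (row layout is an assumption)
```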
## Latest results
These are the [latest results from run 2024-03-22T00:24:03.824734](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_Uncensored/blob/main/results_2024-03-22T00-24-03.824734.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6194310965439249,
"acc_stderr": 0.03297706356835281,
"acc_norm": 0.6236669140168419,
"acc_norm_stderr": 0.033646817819223775,
"mc1": 0.4810281517747858,
"mc1_stderr": 0.017490896405762346,
"mc2": 0.6586281643934825,
"mc2_stderr": 0.015314973601082471
},
"harness|arc:challenge|25": {
"acc": 0.5998293515358362,
"acc_stderr": 0.014317197787809169,
"acc_norm": 0.6382252559726962,
"acc_norm_stderr": 0.014041957945038085
},
"harness|hellaswag|10": {
"acc": 0.662617008564031,
"acc_stderr": 0.0047185047710837655,
"acc_norm": 0.8406691894045011,
"acc_norm_stderr": 0.003652363253289593
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630643,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630643
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130956,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130956
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.01709057380421791,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.01709057380421791
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7931034482758621,
"acc_stderr": 0.014485656041669185,
"acc_norm": 0.7931034482758621,
"acc_norm_stderr": 0.014485656041669185
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688225,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688225
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046633,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046633
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937617,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409828,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409828
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45045632333767927,
"acc_stderr": 0.012707390438502346,
"acc_norm": 0.45045632333767927,
"acc_norm_stderr": 0.012707390438502346
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.019594021136577443,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.019594021136577443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233257,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233257
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4810281517747858,
"mc1_stderr": 0.017490896405762346,
"mc2": 0.6586281643934825,
"mc2_stderr": 0.015314973601082471
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.011508957690722759
},
"harness|gsm8k|5": {
"acc": 0.4184988627748294,
"acc_stderr": 0.013588287284030876
}
}
```
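The same numbers can also be read straight from the linked JSON file without materializing a dataset; a sketch using `huggingface_hub` (the filename is copied from the link above, and the exact nesting inside the file is an assumption):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced above; repo_type must be
# "dataset" because the file lives in a dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_LeroyDyer__Mixtral_Uncensored",
    filename="results_2024-03-22T00-24-03.824734.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Inspect the top-level structure; the per-task metrics shown above
# appear under task-named keys (exact nesting is an assumption).
print(list(data.keys()))
```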
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/65cb2066 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1341
dataset_size: 182
---
# Dataset Card for "65cb2066"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_notstoic__PygmalionCoT-7b | ---
pretty_name: Evaluation run of notstoic/PygmalionCoT-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [notstoic/PygmalionCoT-7b](https://huggingface.co/notstoic/PygmalionCoT-7b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_notstoic__PygmalionCoT-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T15:06:38.792335](https://huggingface.co/datasets/open-llm-leaderboard/details_notstoic__PygmalionCoT-7b/blob/main/results_2023-09-22T15-06-38.792335.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.12111996644295302,\n\
\ \"em_stderr\": 0.0033412757702121106,\n \"f1\": 0.17514471476510068,\n\
\ \"f1_stderr\": 0.0034689450739406216,\n \"acc\": 0.36081482886571287,\n\
\ \"acc_stderr\": 0.00895060187911282\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.12111996644295302,\n \"em_stderr\": 0.0033412757702121106,\n\
\ \"f1\": 0.17514471476510068,\n \"f1_stderr\": 0.0034689450739406216\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.032600454890068235,\n \
\ \"acc_stderr\": 0.004891669021939579\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6890292028413575,\n \"acc_stderr\": 0.01300953473628606\n\
\ }\n}\n```"
repo_url: https://huggingface.co/notstoic/PygmalionCoT-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|arc:challenge|25_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T15_06_38.792335
path:
- '**/details_harness|drop|3_2023-09-22T15-06-38.792335.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T15-06-38.792335.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T15_06_38.792335
path:
- '**/details_harness|gsm8k|5_2023-09-22T15-06-38.792335.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T15-06-38.792335.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hellaswag|10_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:24:33.017908.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T12:24:33.017908.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T12:24:33.017908.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T15_06_38.792335
path:
- '**/details_harness|winogrande|5_2023-09-22T15-06-38.792335.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T15-06-38.792335.parquet'
- config_name: results
data_files:
- split: 2023_07_18T12_24_33.017908
path:
- results_2023-07-18T12:24:33.017908.parquet
- split: 2023_09_22T15_06_38.792335
path:
- results_2023-09-22T15-06-38.792335.parquet
- split: latest
path:
- results_2023-09-22T15-06-38.792335.parquet
---
# Dataset Card for Evaluation run of notstoic/PygmalionCoT-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/notstoic/PygmalionCoT-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [notstoic/PygmalionCoT-7b](https://huggingface.co/notstoic/PygmalionCoT-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_notstoic__PygmalionCoT-7b",
"harness_winogrande_5",
split="train")
```
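Each configuration also exposes a `latest` split (see the YAML header above), so you can pin to the most recent run without hard-coding its timestamp. A minimal sketch:
```python
from datasets import load_dataset

# "latest" resolves to the most recent evaluation run for this task
data = load_dataset("open-llm-leaderboard/details_notstoic__PygmalionCoT-7b",
	"harness_winogrande_5",
	split="latest")
```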
## Latest results
These are the [latest results from run 2023-09-22T15:06:38.792335](https://huggingface.co/datasets/open-llm-leaderboard/details_notstoic__PygmalionCoT-7b/blob/main/results_2023-09-22T15-06-38.792335.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.12111996644295302,
"em_stderr": 0.0033412757702121106,
"f1": 0.17514471476510068,
"f1_stderr": 0.0034689450739406216,
"acc": 0.36081482886571287,
"acc_stderr": 0.00895060187911282
},
"harness|drop|3": {
"em": 0.12111996644295302,
"em_stderr": 0.0033412757702121106,
"f1": 0.17514471476510068,
"f1_stderr": 0.0034689450739406216
},
"harness|gsm8k|5": {
"acc": 0.032600454890068235,
"acc_stderr": 0.004891669021939579
},
"harness|winogrande|5": {
"acc": 0.6890292028413575,
"acc_stderr": 0.01300953473628606
}
}
```
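The aggregated numbers above can also be queried programmatically: the `results` configuration keeps one split per run timestamp plus `latest`. A minimal sketch, assuming the parquet columns mirror the JSON structure shown above:
```python
from datasets import load_dataset

# aggregated metrics of the most recent run
results = load_dataset("open-llm-leaderboard/details_notstoic__PygmalionCoT-7b",
	"results",
	split="latest")
print(results[0])
```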
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
chenglu/hf-blogs-jinaai-embeddings | ---
license: apache-2.0
---
|
fcolt99/oasst1_ro | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 10067699
num_examples: 9317
- name: validation
num_bytes: 381607
num_examples: 348
download_size: 5001235
dataset_size: 10449306
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
edinburghcstr/ami | ---
annotations_creators: []
language:
- en
language_creators: []
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: AMI
size_categories: []
source_datasets: []
tags: []
task_categories:
- automatic-speech-recognition
---
# Dataset Card for AMI
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
- [Terms of Usage](#terms-of-usage)
## Dataset Description
- **Homepage:** https://groups.inf.ed.ac.uk/ami/corpus/
- **Repository:** https://github.com/kaldi-asr/kaldi/tree/master/egs/ami/s5
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** [jonathan@ed.ac.uk](mailto:jonathan@ed.ac.uk)
### Dataset Summary
The AMI Meeting Corpus consists of 100 hours of meeting recordings. The recordings use a range of signals
synchronized to a common timeline. These include close-talking and far-field microphones, individual and
room-view video cameras, and output from a slide projector and an electronic whiteboard. During the meetings,
the participants also have unsynchronized pens available to them that record what is written. The meetings
were recorded in English using three different rooms with different acoustic properties, and include mostly
non-native speakers.
**Note**: This dataset corresponds to the data-processing of [KALDI's AMI S5 recipe](https://github.com/kaldi-asr/kaldi/tree/master/egs/ami/s5).
This means text is normalized and the audio data is chunked according to the scripts above!
To make the user experience as simple as possible, we provide the already chunked data here so that the following can be done:
### Example Usage
```python
from datasets import load_dataset
ds = load_dataset("edinburghcstr/ami", "ihm")
print(ds)
```
gives:
```
DatasetDict({
train: Dataset({
features: ['meeting_id', 'audio_id', 'text', 'audio', 'begin_time', 'end_time', 'microphone_id', 'speaker_id'],
num_rows: 108502
})
validation: Dataset({
features: ['meeting_id', 'audio_id', 'text', 'audio', 'begin_time', 'end_time', 'microphone_id', 'speaker_id'],
num_rows: 13098
})
test: Dataset({
features: ['meeting_id', 'audio_id', 'text', 'audio', 'begin_time', 'end_time', 'microphone_id', 'speaker_id'],
num_rows: 12643
})
})
```
```py
ds["train"][0]
```
automatically loads the audio into memory:
```
{'meeting_id': 'EN2001a',
'audio_id': 'AMI_EN2001a_H00_MEE068_0000557_0000594',
'text': 'OKAY',
'audio': {'path': '/cache/dir/path/downloads/extracted/2d75d5b3e8a91f44692e2973f08b4cac53698f92c2567bd43b41d19c313a5280/EN2001a/train_ami_en2001a_h00_mee068_0000557_0000594.wav',
'array': array([0. , 0. , 0. , ..., 0.00033569, 0.00030518,
0.00030518], dtype=float32),
'sampling_rate': 16000},
'begin_time': 5.570000171661377,
'end_time': 5.940000057220459,
'microphone_id': 'H00',
'speaker_id': 'MEE068'}
```
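Because each row carries both the decoded waveform and its segment boundaries, you can sanity-check a chunk directly. A minimal sketch:
```python
sample = ds["train"][0]

# duration from the alignment metadata vs. the decoded waveform
meta_duration = sample["end_time"] - sample["begin_time"]
audio_duration = len(sample["audio"]["array"]) / sample["audio"]["sampling_rate"]
print(f"{meta_duration:.2f}s (metadata) vs. {audio_duration:.2f}s (audio)")
```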
The dataset was tested for correctness by fine-tuning a Wav2Vec2-Large model on it, specifically [the `wav2vec2-large-lv60` checkpoint](https://huggingface.co/facebook/wav2vec2-large-lv60).
As can be seen in this experiment, training the model for fewer than 2 epochs gives
*Result (WER)*:
| "dev" | "eval" |
|---|---|
| 25.27 | 25.21 |
as can be seen [here](https://huggingface.co/patrickvonplaten/ami-wav2vec2-large-lv60).
The results are in line with those of published papers:
- [*Hybrid acoustic models for distant and multichannel large vocabulary speech recognition*](https://www.researchgate.net/publication/258075865_Hybrid_acoustic_models_for_distant_and_multichannel_large_vocabulary_speech_recognition)
- [Multi-Span Acoustic Modelling using Raw Waveform Signals](https://arxiv.org/abs/1906.11047)
You can run [run.sh](https://huggingface.co/patrickvonplaten/ami-wav2vec2-large-lv60/blob/main/run.sh) to reproduce the result.
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
#### Transcribed Subsets Size
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Citation Information
### Contributions
Thanks to [@sanchit-gandhi](https://github.com/sanchit-gandhi), [@patrickvonplaten](https://github.com/patrickvonplaten),
and [@polinaeterna](https://github.com/polinaeterna) for adding this dataset.
## Terms of Usage
|
bigscience-data/roots_fr_ted_talks_iwslt | ---
language: fr
license: cc-by-nc-nd-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_fr_ted_talks_iwslt
# WIT Ted Talks
- Dataset uid: `ted_talks_iwslt`
### Description
The Web Inventory Talk (WIT3) is a collection of the original TED talks and their translated versions. The translations are available in more than 109 languages, though the distribution is not uniform.
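Access to this subset is gated by the BigScience Ethical Charter agreement (see the YAML header), so loading it requires an authenticated Hub token. A minimal sketch, assuming you have accepted the terms (older `datasets` versions use the `use_auth_token` argument instead):
```python
from datasets import load_dataset

# token=True picks up the token stored by `huggingface-cli login`
ds = load_dataset("bigscience-data/roots_fr_ted_talks_iwslt", token=True)
```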
### Homepage
https://github.com/huggingface/datasets/blob/master/datasets/ted_talks_iwslt/README.md
### Licensing
- open license
- cc-by-nc-nd-4.0: Creative Commons Attribution Non Commercial No Derivatives 4.0 International
TED makes its collection of video recordings and transcripts of talks available under the Creative Commons BY-NC-ND license. WIT3 acknowledges the authorship of TED talks (BY condition) and does not redistribute transcripts for commercial purposes (NC). As regards the integrity of the work (ND), WIT3 only changes the format of the container, while preserving the original contents. WIT3 aims to support research on human language processing as well as the diffusion of TED Talks!
### Speaker Locations
- Southern Europe
- Italy
### Sizes
- 0.0305 % of total
- 0.0736 % of ar
- 0.2002 % of pt
- 0.0128 % of zh
- 0.2236 % of vi
- 0.0330 % of fr
- 0.0545 % of es
- 0.0122 % of en
- 0.3704 % of id
- 0.0373 % of indic-hi
- 0.0330 % of indic-ta
- 0.1393 % of indic-mr
- 0.0305 % of ca
- 0.1179 % of indic-ur
- 0.0147 % of indic-bn
- 0.0240 % of indic-ml
- 0.0244 % of indic-te
- 0.0503 % of indic-gu
- 0.0211 % of indic-kn
- 0.0274 % of eu
- 0.0023 % of indic-as
- 0.0001 % of indic-pa
### BigScience processing steps
#### Filters applied to: ar
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: pt
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: zh
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: fr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: es
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: id
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: ca
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ur
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-as
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-pa
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
|
mila-intel/ProtST-GeneOntology-CC | ---
configs:
- config_name: default
data_files:
- split: train
path: gene_ontology_cc_train.csv
- split: validation
path: gene_ontology_cc_valid.csv
- split: test
path: gene_ontology_cc_test.csv
--- |
one-sec-cv12/chunk_176 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 24011807904.25
num_examples: 249998
download_size: 22231807340
dataset_size: 24011807904.25
---
# Dataset Card for "chunk_176"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chiragshahcompass/addy | ---
license: artistic-2.0
---
|
open-llm-leaderboard/details_databricks__dolly-v2-12b | ---
pretty_name: Evaluation run of databricks/dolly-v2-12b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [databricks/dolly-v2-12b](https://huggingface.co/databricks/dolly-v2-12b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
  \ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_databricks__dolly-v2-12b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-09-23T05:02:42.236847](https://huggingface.co/datasets/open-llm-leaderboard/details_databricks__dolly-v2-12b/blob/main/results_2023-09-23T05-02-42.236847.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.0004191330178826844,\n \"f1\": 0.06285968959731549,\n\
\ \"f1_stderr\": 0.0014820300080071475,\n \"acc\": 0.31032723721601535,\n\
\ \"acc_stderr\": 0.008366390657090902\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826844,\n\
\ \"f1\": 0.06285968959731549,\n \"f1_stderr\": 0.0014820300080071475\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.012130401819560273,\n \
\ \"acc_stderr\": 0.0030152942428909495\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6085240726124704,\n \"acc_stderr\": 0.013717487071290854\n\
\ }\n}\n```"
repo_url: https://huggingface.co/databricks/dolly-v2-12b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|arc:challenge|25_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T05_02_42.236847
path:
- '**/details_harness|drop|3_2023-09-23T05-02-42.236847.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T05-02-42.236847.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T05_02_42.236847
path:
- '**/details_harness|gsm8k|5_2023-09-23T05-02-42.236847.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T05-02-42.236847.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hellaswag|10_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:43:42.069045.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T13:43:42.069045.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T13:43:42.069045.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T05_02_42.236847
path:
- '**/details_harness|winogrande|5_2023-09-23T05-02-42.236847.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T05-02-42.236847.parquet'
- config_name: results
data_files:
- split: 2023_07_18T13_43_42.069045
path:
- results_2023-07-18T13:43:42.069045.parquet
- split: 2023_09_23T05_02_42.236847
path:
- results_2023-09-23T05-02-42.236847.parquet
- split: latest
path:
- results_2023-09-23T05-02-42.236847.parquet
---
# Dataset Card for Evaluation run of databricks/dolly-v2-12b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/databricks/dolly-v2-12b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [databricks/dolly-v2-12b](https://huggingface.co/databricks/dolly-v2-12b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_databricks__dolly-v2-12b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-23T05:02:42.236847](https://huggingface.co/datasets/open-llm-leaderboard/details_databricks__dolly-v2-12b/blob/main/results_2023-09-23T05-02-42.236847.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826844,
"f1": 0.06285968959731549,
"f1_stderr": 0.0014820300080071475,
"acc": 0.31032723721601535,
"acc_stderr": 0.008366390657090902
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826844,
"f1": 0.06285968959731549,
"f1_stderr": 0.0014820300080071475
},
"harness|gsm8k|5": {
"acc": 0.012130401819560273,
"acc_stderr": 0.0030152942428909495
},
"harness|winogrande|5": {
"acc": 0.6085240726124704,
"acc_stderr": 0.013717487071290854
}
}
```
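Individual runs also stay addressable through their timestamped split names (listed in the YAML header), which is handy for comparing the two runs of this model. A minimal sketch:
```python
from datasets import load_dataset

# split names are the run timestamps with '-' and ':' replaced by '_'
run = load_dataset("open-llm-leaderboard/details_databricks__dolly-v2-12b",
	"harness_winogrande_5",
	split="2023_09_23T05_02_42.236847")
```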
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_jondurbin__bagel-dpo-7b-v0.4 | ---
pretty_name: Evaluation run of jondurbin/bagel-dpo-7b-v0.4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/bagel-dpo-7b-v0.4](https://huggingface.co/jondurbin/bagel-dpo-7b-v0.4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
  \ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__bagel-dpo-7b-v0.4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-02-09T12:26:08.289563](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__bagel-dpo-7b-v0.4/blob/main/results_2024-02-09T12-26-08.289563.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6206724451364786,\n\
\ \"acc_stderr\": 0.0329664063441869,\n \"acc_norm\": 0.6242837256806741,\n\
\ \"acc_norm_stderr\": 0.03363029941343461,\n \"mc1\": 0.4749082007343941,\n\
\ \"mc1_stderr\": 0.017481446804104007,\n \"mc2\": 0.6394319602785546,\n\
\ \"mc2_stderr\": 0.01516560925754018\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6493174061433447,\n \"acc_stderr\": 0.013944635930726096,\n\
\ \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.013678810399518822\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6477793268273252,\n\
\ \"acc_stderr\": 0.00476686090717154,\n \"acc_norm\": 0.8429595698068114,\n\
\ \"acc_norm_stderr\": 0.0036309529998437306\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456344,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456344\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7548387096774194,\n \"acc_stderr\": 0.02447224384089554,\n \"\
acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.02447224384089554\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097417,\n\
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097417\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059278,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059278\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.01591955782997606,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.01591955782997606\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847835,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847835\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381394,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381394\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584204,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32849162011173183,\n\
\ \"acc_stderr\": 0.01570793539849645,\n \"acc_norm\": 0.32849162011173183,\n\
\ \"acc_norm_stderr\": 0.01570793539849645\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615693,\n\
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.026858825879488533,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.026858825879488533\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4217731421121252,\n\
\ \"acc_stderr\": 0.012612974369390984,\n \"acc_norm\": 0.4217731421121252,\n\
\ \"acc_norm_stderr\": 0.012612974369390984\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6388888888888888,\n \"acc_stderr\": 0.01943177567703731,\n \
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.01943177567703731\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4749082007343941,\n\
\ \"mc1_stderr\": 0.017481446804104007,\n \"mc2\": 0.6394319602785546,\n\
\ \"mc2_stderr\": 0.01516560925754018\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773223\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.46853677028051555,\n \
\ \"acc_stderr\": 0.013745189948450417\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/bagel-dpo-7b-v0.4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|arc:challenge|25_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|gsm8k|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hellaswag|10_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-26-08.289563.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T12-26-08.289563.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- '**/details_harness|winogrande|5_2024-02-09T12-26-08.289563.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T12-26-08.289563.parquet'
- config_name: results
data_files:
- split: 2024_02_09T12_26_08.289563
path:
- results_2024-02-09T12-26-08.289563.parquet
- split: latest
path:
- results_2024-02-09T12-26-08.289563.parquet
---
# Dataset Card for Evaluation run of jondurbin/bagel-dpo-7b-v0.4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jondurbin/bagel-dpo-7b-v0.4](https://huggingface.co/jondurbin/bagel-dpo-7b-v0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__bagel-dpo-7b-v0.4",
"harness_winogrande_5",
split="train")
```
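The aggregated metrics can be loaded the same way from the `results` configuration declared in this card's metadata; a minimal sketch (the `latest` split mirrors the most recent timestamped run):
```python
from datasets import load_dataset

# Aggregated metrics for the whole run; "latest" points to the newest results file.
results = load_dataset("open-llm-leaderboard/details_jondurbin__bagel-dpo-7b-v0.4",
    "results",
    split="latest")
```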
## Latest results
These are the [latest results from run 2024-02-09T12:26:08.289563](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__bagel-dpo-7b-v0.4/blob/main/results_2024-02-09T12-26-08.289563.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6206724451364786,
"acc_stderr": 0.0329664063441869,
"acc_norm": 0.6242837256806741,
"acc_norm_stderr": 0.03363029941343461,
"mc1": 0.4749082007343941,
"mc1_stderr": 0.017481446804104007,
"mc2": 0.6394319602785546,
"mc2_stderr": 0.01516560925754018
},
"harness|arc:challenge|25": {
"acc": 0.6493174061433447,
"acc_stderr": 0.013944635930726096,
"acc_norm": 0.6757679180887372,
"acc_norm_stderr": 0.013678810399518822
},
"harness|hellaswag|10": {
"acc": 0.6477793268273252,
"acc_stderr": 0.00476686090717154,
"acc_norm": 0.8429595698068114,
"acc_norm_stderr": 0.0036309529998437306
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456344,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.02447224384089554,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.02447224384089554
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097417,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097417
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059278,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059278
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.01591955782997606,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.01591955782997606
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847835,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847835
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381394,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381394
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584204,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32849162011173183,
"acc_stderr": 0.01570793539849645,
"acc_norm": 0.32849162011173183,
"acc_norm_stderr": 0.01570793539849645
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615693,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488533,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488533
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4217731421121252,
"acc_stderr": 0.012612974369390984,
"acc_norm": 0.4217731421121252,
"acc_norm_stderr": 0.012612974369390984
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.01943177567703731,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.01943177567703731
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4749082007343941,
"mc1_stderr": 0.017481446804104007,
"mc2": 0.6394319602785546,
"mc2_stderr": 0.01516560925754018
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.011616198215773223
},
"harness|gsm8k|5": {
"acc": 0.46853677028051555,
"acc_stderr": 0.013745189948450417
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mcemilg/mkqa_tr | ---
task_categories:
- question-answering
language:
- tr
dataset_info:
features:
- name: example_id
dtype: string
- name: query
dtype: string
- name: answers
dtype: string
---
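A minimal loading sketch, assuming the `datasets` library is installed and that the data ships as a single `train` split (the split name is an assumption; the column names come from the YAML features above):

```python
from datasets import load_dataset

# "train" is assumed here; inspect the returned DatasetDict if the split name differs.
ds = load_dataset("mcemilg/mkqa_tr", split="train")
# Each row carries example_id, query, and answers (all strings per the YAML).
print(ds[0]["query"], "->", ds[0]["answers"])
```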
Homepage: https://huggingface.co/datasets/mkqa |
pumaML/ML-NLP | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: _id
dtype: string
- name: num-step
dtype: int64
- name: prevId
dtype: int64
- name: action
dtype: string
- name: value
dtype: string
- name: testid
dtype: string
splits:
- name: train
num_bytes: 473479.3968887879
num_examples: 3600
- name: validation
num_bytes: 52608.82187653199
num_examples: 400
download_size: 0
dataset_size: 526088.2187653199
---
# Dataset Card for "ML-NLP"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sam2ai/hindi_story_cloze_mini | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
dataset_info:
features:
- name: story_id
dtype: string
- name: input_sentence_1
dtype: string
- name: input_sentence_2
dtype: string
- name: input_sentence_3
dtype: string
- name: input_sentence_4
dtype: string
- name: sentence_quiz1
dtype: string
- name: sentence_quiz2
dtype: string
- name: answer_right_ending
dtype: int32
splits:
- name: train
num_bytes: 39375
num_examples: 50
- name: eval
num_bytes: 39375
num_examples: 50
download_size: 55954
dataset_size: 78750
---
# Dataset Card for "hindi_story_cloze"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
markab/my_validation_dataset | ---
dataset_info:
config_name: markab/coqa_qa_multi
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: answers
list:
- name: answer
dtype: string
- name: question_id
dtype: string
splits:
- name: train
num_bytes: 4736792.767891934
num_examples: 1232
download_size: 2225402
dataset_size: 4736792.767891934
configs:
- config_name: markab/coqa_qa_multi
data_files:
- split: train
path: markab/coqa_qa_multi/train-*
---
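A minimal loading sketch, assuming the `datasets` library; the config name `markab/coqa_qa_multi` comes from the YAML above and is passed explicitly since it is not `default`:

```python
from datasets import load_dataset

# Select the single config by its (non-default) name from the YAML metadata.
ds = load_dataset("markab/my_validation_dataset", "markab/coqa_qa_multi", split="train")
# Rows carry chat-style messages (content/role pairs) plus per-question answers.
print(ds[0]["messages"][0]["role"], "|", len(ds[0]["answers"]), "answers")
```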
|
nlplabtdtu/logic-20 | ---
dataset_info:
features:
- name: question
dtype: string
- name: options
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 7607
num_examples: 19
download_size: 7636
dataset_size: 7607
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "logic-20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_MiniMoog__Mergerix-7b-v0.5 | ---
pretty_name: Evaluation run of MiniMoog/Mergerix-7b-v0.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MiniMoog/Mergerix-7b-v0.5](https://huggingface.co/MiniMoog/Mergerix-7b-v0.5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MiniMoog__Mergerix-7b-v0.5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-03T14:15:31.512540](https://huggingface.co/datasets/open-llm-leaderboard/details_MiniMoog__Mergerix-7b-v0.5/blob/main/results_2024-04-03T14-15-31.512540.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6508988768810555,\n\
\ \"acc_stderr\": 0.032088264873047896,\n \"acc_norm\": 0.6501115943927149,\n\
\ \"acc_norm_stderr\": 0.032761114936049135,\n \"mc1\": 0.6291309669522643,\n\
\ \"mc1_stderr\": 0.016909693580248835,\n \"mc2\": 0.7807249235143263,\n\
\ \"mc2_stderr\": 0.013698115738515634\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838795,\n\
\ \"acc_norm\": 0.7337883959044369,\n \"acc_norm_stderr\": 0.012915774781523198\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7176857199761004,\n\
\ \"acc_stderr\": 0.004492055279407106,\n \"acc_norm\": 0.891256721768572,\n\
\ \"acc_norm_stderr\": 0.0031068060075356337\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n\
\ \"acc_stderr\": 0.016531170993278888,\n \"acc_norm\": 0.4245810055865922,\n\
\ \"acc_norm_stderr\": 0.016531170993278888\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967287,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967287\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.012752858346533127,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.012752858346533127\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6291309669522643,\n\
\ \"mc1_stderr\": 0.016909693580248835,\n \"mc2\": 0.7807249235143263,\n\
\ \"mc2_stderr\": 0.013698115738515634\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272955\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6937073540561031,\n \
\ \"acc_stderr\": 0.01269693010656291\n }\n}\n```"
repo_url: https://huggingface.co/MiniMoog/Mergerix-7b-v0.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|arc:challenge|25_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|gsm8k|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hellaswag|10_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T14-15-31.512540.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T14-15-31.512540.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- '**/details_harness|winogrande|5_2024-04-03T14-15-31.512540.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-03T14-15-31.512540.parquet'
- config_name: results
data_files:
- split: 2024_04_03T14_15_31.512540
path:
- results_2024-04-03T14-15-31.512540.parquet
- split: latest
path:
- results_2024-04-03T14-15-31.512540.parquet
---
# Dataset Card for Evaluation run of MiniMoog/Mergerix-7b-v0.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MiniMoog/Mergerix-7b-v0.5](https://huggingface.co/MiniMoog/Mergerix-7b-v0.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MiniMoog__Mergerix-7b-v0.5",
"harness_winogrande_5",
split="train")
```
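The same pattern works for any configuration named in the metadata above; for instance, a sketch that loads the aggregated `results` configuration (per the YAML, its `latest` split mirrors the most recent run):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" tracks the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_MiniMoog__Mergerix-7b-v0.5",
    "results",
    split="latest",
)
print(results[0])
```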
## Latest results
These are the [latest results from run 2024-04-03T14:15:31.512540](https://huggingface.co/datasets/open-llm-leaderboard/details_MiniMoog__Mergerix-7b-v0.5/blob/main/results_2024-04-03T14-15-31.512540.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6508988768810555,
"acc_stderr": 0.032088264873047896,
"acc_norm": 0.6501115943927149,
"acc_norm_stderr": 0.032761114936049135,
"mc1": 0.6291309669522643,
"mc1_stderr": 0.016909693580248835,
"mc2": 0.7807249235143263,
"mc2_stderr": 0.013698115738515634
},
"harness|arc:challenge|25": {
"acc": 0.7150170648464164,
"acc_stderr": 0.013191348179838795,
"acc_norm": 0.7337883959044369,
"acc_norm_stderr": 0.012915774781523198
},
"harness|hellaswag|10": {
"acc": 0.7176857199761004,
"acc_stderr": 0.004492055279407106,
"acc_norm": 0.891256721768572,
"acc_norm_stderr": 0.0031068060075356337
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.016531170993278888,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.016531170993278888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967287,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967287
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533127,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533127
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6291309669522643,
"mc1_stderr": 0.016909693580248835,
"mc2": 0.7807249235143263,
"mc2_stderr": 0.013698115738515634
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272955
},
"harness|gsm8k|5": {
"acc": 0.6937073540561031,
"acc_stderr": 0.01269693010656291
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
BrainGPT/BrainBench_Human_v0.1.csv | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: doi
dtype: string
- name: journal_section
dtype: string
- name: original_abstract
dtype: string
- name: incorrect_abstract
dtype: string
splits:
- name: train
num_bytes: 636038
num_examples: 200
download_size: 355294
dataset_size: 636038
license: apache-2.0
tags:
- neuroscience
- forward-looking
pretty_name: BrainBench
size_categories:
- n<1K
---
# What is BrainBench?
BrainBench is a forward-looking benchmark for neuroscience. BrainBench evaluates test-takers' ability to predict neuroscience results.
# What is BrainBench made of?
BrainBench's test cases were sourced from recent *Journal of Neuroscience* abstracts across five neuroscience domains:
Behavioral/Cognitive, Systems/Circuits, Neurobiology of Disease, Cellular/Molecular, and Developmental/Plasticity/Repair.
Test-takers chose between the original abstract and a version altered to significantly change the result while maintaining coherence.
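For readers who want to inspect the data, a minimal loading sketch is below. The repo id and column names are taken from this card's YAML header; everything else is illustrative:

```python
from datasets import load_dataset

# Load the single "train" split declared in the YAML header above.
ds = load_dataset("BrainGPT/BrainBench_Human_v0.1.csv", split="train")

# Columns per the header: doi, journal_section, original_abstract, incorrect_abstract
print(ds.column_names)
print(ds[0]["journal_section"])  # one of the five J. Neurosci. sections
```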
# How is BrainBench applied?
Human experts and large language models (LLMs) were tasked with selecting the correct (i.e., original) version from the two options.
Human experts made their choices and provided confidence and expertise ratings in an online study.
LLMs were scored as choosing the abstract with the lower perplexity (i.e., the passage that was less surprising to the model); their confidence was taken to be proportional to the difference in perplexity between the two options.
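A rough sketch of this scoring rule is shown below. This is not the authors' exact evaluation code; the model choice (`gpt2`), the truncation length, and the use of mean token-level cross-entropy as the perplexity estimate are all illustrative assumptions:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative model choice; the benchmark itself is model-agnostic.
model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

def perplexity(text: str) -> float:
    """exp(mean token cross-entropy) of `text` under the model."""
    enc = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
    with torch.no_grad():
        out = model(**enc, labels=enc["input_ids"])
    return torch.exp(out.loss).item()

def score_item(original_abstract: str, incorrect_abstract: str):
    """Pick the less surprising abstract; confidence is the perplexity gap."""
    ppl_orig = perplexity(original_abstract)
    ppl_alt = perplexity(incorrect_abstract)
    is_correct = ppl_orig < ppl_alt
    confidence = abs(ppl_orig - ppl_alt)
    return is_correct, confidence
```

Accuracy over the 200 test cases is then simply the fraction of rows for which `is_correct` is true.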
**BrainBench_Human_v0.1.csv** contains the test cases whose altered abstracts were crafted by human experts. |
J00rge/OrquestraCid | ---
license: openrail
---
|