| datasetId | card |
|---|---|
henrydz/doc_classify_first_sample | ---
license: apache-2.0
---
|
Mukesh555/indian_lawyer_dataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2409918
num_examples: 1000
download_size: 1030374
dataset_size: 2409918
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vwxyzjn/openhermes-dev-4096-new-tokens__mistralai_Mixtral-8x7B-Instruct-v0.1__1707858724 | ---
dataset_info:
features:
- name: source
dtype: string
- name: category
dtype: string
- name: prompt
dtype: string
- name: candidate0_policy
dtype: string
- name: candidate0
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate1
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate1_policy
dtype: string
splits:
- name: train
num_bytes: 41415067.0
num_examples: 10000
download_size: 22046889
dataset_size: 41415067.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MacLeanLuke/fake-email-campaign | ---
license: openrail
---
|
species_800 | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: species800
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B
'2': I
config_name: species_800
splits:
- name: train
num_bytes: 2579096
num_examples: 5734
- name: validation
num_bytes: 385756
num_examples: 831
- name: test
num_bytes: 737760
num_examples: 1631
download_size: 18204624
dataset_size: 3702612
---
# Dataset Card for Species-800
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [SPECIES](https://species.jensenlab.org/)
- **Repository:**
- **Paper:** https://doi.org/10.1371/journal.pone.0065390
- **Leaderboard:**
- **Point of Contact:** [Lars Juhl Jensen](mailto:lars.juhl.jensen@cpr.ku.dk)
### Dataset Summary
The S800 corpus is a manually annotated, abstract-based corpus: it comprises 800 PubMed abstracts in which organism mentions were identified and mapped to the corresponding NCBI Taxonomy identifiers.
To increase the taxonomic diversity of mentions, the S800 abstracts were collected by selecting 100 abstracts from each of the following 8 categories: bacteriology, botany, entomology, medicine, mycology, protistology, virology and zoology. S800 was annotated with a focus on the species level; however, mentions of higher taxa (such as genera, families and orders) have also been considered.
The Species-800 dataset was pre-processed and split based on the preprocessing by Pyysalo (https://github.com/spyysalo/s800).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English (`en`).
## Dataset Structure
### Data Instances
```
{'id': '0',
'tokens': ['Methanoregula',
'formicica',
'sp',
'.',
'nov',
'.',
',',
'a',
'methane',
'-',
'producing',
'archaeon',
'isolated',
'from',
'methanogenic',
'sludge',
'.'],
'ner_tags': [1, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}
```
### Data Fields
- `id`: Sentence identifier.
- `tokens`: Array of tokens composing a sentence.
- `ner_tags`: Array of tags, where `0` (`O`) marks a token outside any species mention, `1` (`B`) marks the first token of a species mention and `2` (`I`) marks its subsequent tokens.
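The tag scheme can be decoded without any extra dependencies. The sketch below (using the example instance shown under Data Instances) maps the integer tags back to the `O`/`B`/`I` labels declared in `dataset_info` and collects the annotated species spans:

```python
# Label order follows the class_label names in dataset_info: '0': O, '1': B, '2': I.
LABELS = ["O", "B", "I"]

def extract_spans(tokens, ner_tags):
    """Return the token spans tagged as species mentions."""
    spans, current = [], []
    for token, tag in zip(tokens, ner_tags):
        label = LABELS[tag]
        if label == "B":                 # first token of a new mention
            if current:
                spans.append(" ".join(current))
            current = [token]
        elif label == "I" and current:   # continuation of the open mention
            current.append(token)
        else:                            # 'O': close any open mention
            if current:
                spans.append(" ".join(current))
                current = []
    if current:
        spans.append(" ".join(current))
    return spans

tokens = ['Methanoregula', 'formicica', 'sp', '.', 'nov', '.', ',', 'a',
          'methane', '-', 'producing', 'archaeon', 'isolated', 'from',
          'methanogenic', 'sludge', '.']
ner_tags = [1, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
print(extract_spans(tokens, ner_tags))  # ['Methanoregula formicica']
```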
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The species-level S800 corpus is subject to Medline restrictions.
### Citation Information
Original data:
```
@article{pafilis2013species,
title={The SPECIES and ORGANISMS resources for fast and accurate identification of taxonomic names in text},
author={Pafilis, Evangelos and Frankild, Sune P and Fanini, Lucia and Faulwetter, Sarah and Pavloudi, Christina and Vasileiadou, Aikaterini and Arvanitidis, Christos and Jensen, Lars Juhl},
journal={PloS one},
volume={8},
number={6},
pages={e65390},
year={2013},
publisher={Public Library of Science}
}
```
Source data of this dataset:
```
@article{10.1093/bioinformatics/btz682,
author = {Lee, Jinhyuk and Yoon, Wonjin and Kim, Sungdong and Kim, Donghyeon and Kim, Sunkyu and So, Chan Ho and Kang, Jaewoo},
title = "{BioBERT: a pre-trained biomedical language representation model for biomedical text mining}",
journal = {Bioinformatics},
volume = {36},
number = {4},
pages = {1234-1240},
year = {2019},
month = {09},
issn = {1367-4803},
doi = {10.1093/bioinformatics/btz682},
url = {https://doi.org/10.1093/bioinformatics/btz682},
eprint = {https://academic.oup.com/bioinformatics/article-pdf/36/4/1234/48983216/bioinformatics\_36\_4\_1234.pdf},
}
```
and the preprocessed version from https://github.com/spyysalo/s800.
### Contributions
Thanks to [@edugp](https://github.com/edugp) for adding this dataset. |
Kabatubare/autotrain-data-1w6s-u4vt-i7yo | ---
dataset_info:
features:
- name: autotrain_text
dtype: string
splits:
- name: train
num_bytes: 19109937
num_examples: 23437
- name: validation
num_bytes: 19109937
num_examples: 23437
download_size: 20605004
dataset_size: 38219874
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "autotrain-data-1w6s-u4vt-i7yo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
emilgoh/verilog-dataset-v3 | ---
license: apache-2.0
---
|
lzhnb/analytic-splatting | ---
license: mit
---
|
pphuc25/uit_data_sample | ---
dataset_info:
features:
- name: id
dtype: string
- name: context
dtype: string
- name: claim
dtype: string
- name: verdict
dtype: string
- name: evidence
dtype: string
- name: domain
dtype: string
splits:
- name: train
num_bytes: 4167523
num_examples: 1000
download_size: 1991987
dataset_size: 4167523
---
# Dataset Card for "uit_data_sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pranjalipathre/img2pose | ---
dataset_info:
- config_name: video_00
features:
- name: original_image
dtype: image
- name: edit_pose
dtype: string
splits:
- name: train
num_bytes: 956869
num_examples: 3267
download_size: 413494839
dataset_size: 956869
- config_name: video_01
features:
- name: original_image
dtype: image
- name: edit_pose
dtype: string
splits:
- name: train
num_bytes: 2605958
num_examples: 9112
download_size: 765552635
dataset_size: 2605958
---
|
Anusha64/Updated-Aeon-dataset | ---
license: mit
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: Content
dtype: string
splits:
- name: train
num_bytes: 40068
num_examples: 21
- name: validation
num_bytes: 9190
num_examples: 5
download_size: 42532
dataset_size: 49258
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
jrajan/support-pages | ---
license: apache-2.0
---
|
SUSTech/wildchat_zh | ---
dataset_info:
features:
- name: conversation_id
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: language
dtype: string
- name: redacted
dtype: bool
- name: role
dtype: string
- name: toxic
dtype: bool
- name: model
dtype: string
splits:
- name: train
num_bytes: 424092565
num_examples: 104301
download_size: 211445530
dataset_size: 424092565
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pythonist/demod | ---
license: apache-2.0
---
|
FabioSantos/autismoDataset | ---
license: mit
---
|
CyberHarem/kashima_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kashima/鹿島 (Kantai Collection)
This is the dataset of kashima/鹿島 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `twintails, grey_hair, wavy_hair, blue_eyes, breasts, long_hair, hat, large_breasts, beret`; these are pruned from this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 561.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kashima_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 374.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kashima_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1266 | 827.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kashima_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 519.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kashima_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1266 | 1.04 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kashima_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kashima_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, alternate_costume, blush, hair_flower, kimono, floral_print, obi |
| 1 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, solo, upper_body, white_background, white_hair, alternate_costume, blush, dress, simple_background, closed_mouth, hair_between_eyes, hair_ribbon, long_sleeves, short_sleeves, smile |
| 2 | 8 |  |  |  |  |  | 1girl, epaulettes, looking_at_viewer, military_uniform, smile, solo, white_gloves, blush, upper_body, red_neckerchief, simple_background, white_background |
| 3 | 12 |  |  |  |  |  | 1girl, epaulettes, looking_at_viewer, military_uniform, solo, white_gloves, smile, pleated_skirt, simple_background, miniskirt, white_background |
| 4 | 7 |  |  |  |  |  | 1girl, epaulettes, long_sleeves, looking_at_viewer, military_uniform, miniskirt, pleated_skirt, smile, solo, white_gloves, frilled_sleeves, red_neckerchief, simple_background, white_background, blush, jacket |
| 5 | 11 |  |  |  |  |  | 1girl, black_headwear, epaulettes, long_sleeves, looking_at_viewer, military_uniform, red_neckerchief, sidelocks, solo, white_jacket, buttons, military_jacket, simple_background, pleated_skirt, smile, white_background, white_gloves, frilled_sleeves, miniskirt, black_skirt, cowboy_shot, blush, hair_between_eyes, upper_body |
| 6 | 8 |  |  |  |  |  | 1girl, employee_uniform, skirt, smile, solo, open_mouth, blush |
| 7 | 13 |  |  |  |  |  | 1girl, cat_cutout, cat_lingerie, choker, cleavage_cutout, jingle_bell, looking_at_viewer, neck_bell, solo, blush, smile, black_bra, black_panties, underwear_only, cat_ear_panties, side-tie_panties, cat_ears, navel, collarbone, grey_eyes, cat_tail, simple_background |
| 8 | 10 |  |  |  |  |  | 1girl, santa_costume, capelet, christmas, hair_bell, solo, looking_at_viewer, santa_hat, smile, blush, aran_sweater, fur_trim, gift_box, skirt |
| 9 | 13 |  |  |  |  |  | 1girl, blush, cleavage, solo, looking_at_viewer, collarbone, smile, open_mouth, front-tie_top, navel, side-tie_bikini_bottom, outdoors, black_bikini, blue_sky, cloud, day, hair_between_eyes, jacket, ocean, open_clothes, bangs, beach, sidelocks, white_bikini |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | alternate_costume | blush | hair_flower | kimono | floral_print | obi | upper_body | white_background | white_hair | dress | simple_background | closed_mouth | hair_between_eyes | hair_ribbon | long_sleeves | short_sleeves | epaulettes | military_uniform | white_gloves | red_neckerchief | pleated_skirt | miniskirt | frilled_sleeves | jacket | black_headwear | sidelocks | white_jacket | buttons | military_jacket | black_skirt | cowboy_shot | employee_uniform | skirt | open_mouth | cat_cutout | cat_lingerie | choker | cleavage_cutout | jingle_bell | neck_bell | black_bra | black_panties | underwear_only | cat_ear_panties | side-tie_panties | cat_ears | navel | collarbone | grey_eyes | cat_tail | santa_costume | capelet | christmas | hair_bell | santa_hat | aran_sweater | fur_trim | gift_box | cleavage | front-tie_top | side-tie_bikini_bottom | outdoors | black_bikini | blue_sky | cloud | day | ocean | open_clothes | bangs | beach | white_bikini |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:--------------------|:--------|:--------------|:---------|:---------------|:------|:-------------|:-------------------|:-------------|:--------|:--------------------|:---------------|:--------------------|:--------------|:---------------|:----------------|:-------------|:-------------------|:---------------|:------------------|:----------------|:------------|:------------------|:---------|:-----------------|:------------|:---------------|:----------|:------------------|:--------------|:--------------|:-------------------|:--------|:-------------|:-------------|:---------------|:---------|:------------------|:--------------|:------------|:------------|:----------------|:-----------------|:------------------|:-------------------|:-----------|:--------|:-------------|:------------|:-----------|:----------------|:----------|:------------|:------------|:------------|:---------------|:-----------|:-----------|:-----------|:----------------|:-------------------------|:-----------|:---------------|:-----------|:--------|:------|:--------|:---------------|:--------|:--------|:---------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | | X | | | | | X | X | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | X | X | X | | | | | | | | X | | | X | | | | | | X | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | X | X | | X | | | | | | X | | | X | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 11 |  |  |  |  |  | X | X | X | X | | X | | | | | X | X | | | X | | X | | X | | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 13 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 8 | 10 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 9 | 13 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | X | | | | | | | | | | | X | | X | | | | | | | | X | | | | | | | | | | | | | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Am0MuK/md_invoices | ---
language:
- ro
- ru
pretty_name: invoices
--- |
SuperLuigi01/english_train_2k | ---
license: unknown
---
|
316usman/thematic2a | ---
license: bsd
dataset_info:
features:
- name: text
dtype: string
- name: thematic
dtype: string
- name: sub-thematic
dtype: string
- name: country
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 680072594
num_examples: 871706
download_size: 204712684
dataset_size: 680072594
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Hardik1234/reactjs-train | ---
dataset_info:
features:
- name: path
dtype: string
- name: repo_name
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 1646910413
num_examples: 410387
download_size: 621037694
dataset_size: 1646910413
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PeterLawrence/connectivity.1d.v3 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 22449
num_examples: 174
download_size: 0
dataset_size: 22449
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "connectivity.1d.v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ewqr2130__mistral-inst-ppo | ---
pretty_name: Evaluation run of ewqr2130/mistral-inst-ppo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ewqr2130/mistral-inst-ppo](https://huggingface.co/ewqr2130/mistral-inst-ppo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__mistral-inst-ppo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T00:39:18.137600](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__mistral-inst-ppo/blob/main/results_2024-01-05T00-39-18.137600.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6072120670121945,\n\
\ \"acc_stderr\": 0.03313666182377149,\n \"acc_norm\": 0.6126266971301495,\n\
\ \"acc_norm_stderr\": 0.03381076261531865,\n \"mc1\": 0.4724602203182375,\n\
\ \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6229867253601375,\n\
\ \"mc2_stderr\": 0.01576578565924401\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520765,\n\
\ \"acc_norm\": 0.6237201365187713,\n \"acc_norm_stderr\": 0.014157022555407154\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6353316072495518,\n\
\ \"acc_stderr\": 0.004803533333364223,\n \"acc_norm\": 0.8320055765783708,\n\
\ \"acc_norm_stderr\": 0.003730972670511862\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n\
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.024757473902752056,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.024757473902752056\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n\
\ \"acc_stderr\": 0.026662010578567104,\n \"acc_norm\": 0.6741935483870968,\n\
\ \"acc_norm_stderr\": 0.026662010578567104\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5820512820512821,\n \"acc_stderr\": 0.02500732988246122,\n \
\ \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.02500732988246122\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501601,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501601\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260594,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260594\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n\
\ \"acc_stderr\": 0.014711684386139963,\n \"acc_norm\": 0.7841634738186463,\n\
\ \"acc_norm_stderr\": 0.014711684386139963\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688225,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688225\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3452513966480447,\n\
\ \"acc_stderr\": 0.015901432608930358,\n \"acc_norm\": 0.3452513966480447,\n\
\ \"acc_norm_stderr\": 0.015901432608930358\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427054,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427054\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42633637548891784,\n\
\ \"acc_stderr\": 0.012630884771599698,\n \"acc_norm\": 0.42633637548891784,\n\
\ \"acc_norm_stderr\": 0.012630884771599698\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6176470588235294,\n \"acc_stderr\": 0.01965992249362335,\n \
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.01965992249362335\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.030965903123573037,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.030965903123573037\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4724602203182375,\n\
\ \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6229867253601375,\n\
\ \"mc2_stderr\": 0.01576578565924401\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.01183587216483668\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3707354056103108,\n \
\ \"acc_stderr\": 0.013304267705458428\n }\n}\n```"
repo_url: https://huggingface.co/ewqr2130/mistral-inst-ppo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-39-18.137600.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-39-18.137600.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- '**/details_harness|winogrande|5_2024-01-05T00-39-18.137600.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T00-39-18.137600.parquet'
- config_name: results
data_files:
- split: 2024_01_05T00_39_18.137600
path:
- results_2024-01-05T00-39-18.137600.parquet
- split: latest
path:
- results_2024-01-05T00-39-18.137600.parquet
---
# Dataset Card for Evaluation run of ewqr2130/mistral-inst-ppo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ewqr2130/mistral-inst-ppo](https://huggingface.co/ewqr2130/mistral-inst-ppo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ewqr2130__mistral-inst-ppo",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-05T00:39:18.137600](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__mistral-inst-ppo/blob/main/results_2024-01-05T00-39-18.137600.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6072120670121945,
"acc_stderr": 0.03313666182377149,
"acc_norm": 0.6126266971301495,
"acc_norm_stderr": 0.03381076261531865,
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6229867253601375,
"mc2_stderr": 0.01576578565924401
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520765,
"acc_norm": 0.6237201365187713,
"acc_norm_stderr": 0.014157022555407154
},
"harness|hellaswag|10": {
"acc": 0.6353316072495518,
"acc_stderr": 0.004803533333364223,
"acc_norm": 0.8320055765783708,
"acc_norm_stderr": 0.003730972670511862
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.024757473902752056,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.024757473902752056
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.026662010578567104,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.026662010578567104
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5820512820512821,
"acc_stderr": 0.02500732988246122,
"acc_norm": 0.5820512820512821,
"acc_norm_stderr": 0.02500732988246122
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501601,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501601
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260594,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260594
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.014711684386139963,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.014711684386139963
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688225,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688225
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3452513966480447,
"acc_stderr": 0.015901432608930358,
"acc_norm": 0.3452513966480447,
"acc_norm_stderr": 0.015901432608930358
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427054,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427054
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42633637548891784,
"acc_stderr": 0.012630884771599698,
"acc_norm": 0.42633637548891784,
"acc_norm_stderr": 0.012630884771599698
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.01965992249362335,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.01965992249362335
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573037,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573037
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6229867253601375,
"mc2_stderr": 0.01576578565924401
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.01183587216483668
},
"harness|gsm8k|5": {
"acc": 0.3707354056103108,
"acc_stderr": 0.013304267705458428
}
}
```
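The aggregated "all" entry above is (roughly) the mean of the per-task scores. As an illustrative sketch, not part of the official leaderboard code, the same kind of average can be recomputed from the JSON by hand. The three task entries below are copied from the results above; any filtering key such as the `harness|hendrycksTest-` prefix is an assumption about the naming scheme:

```python
# Illustrative sketch: average the "acc" values of a few per-task
# results (values copied by hand from the JSON above).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5851851851851851},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6052631578947368},
}

# Select the MMLU-style tasks by their "harness|hendrycksTest-" prefix
# (assumed naming convention), then take the unweighted mean accuracy.
mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(results[t]["acc"] for t in mmlu_tasks) / len(mmlu_tasks)
print(round(mean_acc, 4))
```

With the full JSON loaded (e.g. via `json.load` on the downloaded results file), the same two lines reproduce the overall MMLU average.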
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MuraliGanesan/LayoutLM_Training_dataset | ---
license: afl-3.0
---
|
katielink/med_qa | ---
license: mit
---
|
Rifky/IndonesiaAI-Finetune-Demo | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2676893
num_examples: 325
download_size: 723263
dataset_size: 2676893
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_azarafrooz__mistral-v2-7b-selfplay-v0-test | ---
pretty_name: Evaluation run of azarafrooz/mistral-v2-7b-selfplay-v0-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [azarafrooz/mistral-v2-7b-selfplay-v0-test](https://huggingface.co/azarafrooz/mistral-v2-7b-selfplay-v0-test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_azarafrooz__mistral-v2-7b-selfplay-v0-test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-16T00:39:15.909425](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__mistral-v2-7b-selfplay-v0-test/blob/main/results_2024-03-16T00-39-15.909425.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6063783478767647,\n\
\ \"acc_stderr\": 0.03315591821100337,\n \"acc_norm\": 0.6108774010374395,\n\
\ \"acc_norm_stderr\": 0.033828819592458446,\n \"mc1\": 0.5299877600979193,\n\
\ \"mc1_stderr\": 0.017471992091697537,\n \"mc2\": 0.6790668311962296,\n\
\ \"mc2_stderr\": 0.015234313921441646\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5853242320819113,\n \"acc_stderr\": 0.014397070564409174,\n\
\ \"acc_norm\": 0.6296928327645052,\n \"acc_norm_stderr\": 0.01411129875167495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6671977693686517,\n\
\ \"acc_stderr\": 0.004702533775930293,\n \"acc_norm\": 0.848635729934276,\n\
\ \"acc_norm_stderr\": 0.0035767110656195872\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6290322580645161,\n\
\ \"acc_stderr\": 0.027480541887953593,\n \"acc_norm\": 0.6290322580645161,\n\
\ \"acc_norm_stderr\": 0.027480541887953593\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306443,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5564102564102564,\n \"acc_stderr\": 0.025189149894764205,\n\
\ \"acc_norm\": 0.5564102564102564,\n \"acc_norm_stderr\": 0.025189149894764205\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7908256880733945,\n \"acc_stderr\": 0.017437937173343233,\n \"\
acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.017437937173343233\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501964,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501964\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.014836205167333555,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.014836205167333555\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.024883140570071762,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.024883140570071762\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3106145251396648,\n\
\ \"acc_stderr\": 0.015476515438005567,\n \"acc_norm\": 0.3106145251396648,\n\
\ \"acc_norm_stderr\": 0.015476515438005567\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.026568921015457138,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.026568921015457138\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.02548311560119545,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.02548311560119545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n\
\ \"acc_stderr\": 0.012654565234622866,\n \"acc_norm\": 0.43285528031290743,\n\
\ \"acc_norm_stderr\": 0.012654565234622866\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573705,\n \
\ \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573705\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5299877600979193,\n\
\ \"mc1_stderr\": 0.017471992091697537,\n \"mc2\": 0.6790668311962296,\n\
\ \"mc2_stderr\": 0.015234313921441646\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774094\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39727065959059893,\n \
\ \"acc_stderr\": 0.013478659652337792\n }\n}\n```"
repo_url: https://huggingface.co/azarafrooz/mistral-v2-7b-selfplay-v0-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|arc:challenge|25_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|gsm8k|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hellaswag|10_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T00-39-15.909425.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-16T00-39-15.909425.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- '**/details_harness|winogrande|5_2024-03-16T00-39-15.909425.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-16T00-39-15.909425.parquet'
- config_name: results
data_files:
- split: 2024_03_16T00_39_15.909425
path:
- results_2024-03-16T00-39-15.909425.parquet
- split: latest
path:
- results_2024-03-16T00-39-15.909425.parquet
---
# Dataset Card for Evaluation run of azarafrooz/mistral-v2-7b-selfplay-v0-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [azarafrooz/mistral-v2-7b-selfplay-v0-test](https://huggingface.co/azarafrooz/mistral-v2-7b-selfplay-v0-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_azarafrooz__mistral-v2-7b-selfplay-v0-test",
"harness_winogrande_5",
split="train")
```
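Each configuration name encodes the harness task and the few-shot count (`harness_<task>_<n_shots>`); the aggregate `results` configuration does not follow this pattern. A small helper (hypothetical, not shipped with this dataset) can recover both parts:

```python
def parse_config_name(name: str):
    """Split a config name like 'harness_winogrande_5' into (task, n_shots).

    Purely illustrative; assumes the harness_<task>_<n> naming convention.
    """
    prefix, _, shots = name.rpartition("_")
    task = prefix.removeprefix("harness_")
    return task, int(shots)

print(parse_config_name("harness_winogrande_5"))     # ('winogrande', 5)
print(parse_config_name("harness_truthfulqa_mc_0"))  # ('truthfulqa_mc', 0)
```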
## Latest results
These are the [latest results from run 2024-03-16T00:39:15.909425](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__mistral-v2-7b-selfplay-v0-test/blob/main/results_2024-03-16T00-39-15.909425.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6063783478767647,
"acc_stderr": 0.03315591821100337,
"acc_norm": 0.6108774010374395,
"acc_norm_stderr": 0.033828819592458446,
"mc1": 0.5299877600979193,
"mc1_stderr": 0.017471992091697537,
"mc2": 0.6790668311962296,
"mc2_stderr": 0.015234313921441646
},
"harness|arc:challenge|25": {
"acc": 0.5853242320819113,
"acc_stderr": 0.014397070564409174,
"acc_norm": 0.6296928327645052,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.6671977693686517,
"acc_stderr": 0.004702533775930293,
"acc_norm": 0.848635729934276,
"acc_norm_stderr": 0.0035767110656195872
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6290322580645161,
"acc_stderr": 0.027480541887953593,
"acc_norm": 0.6290322580645161,
"acc_norm_stderr": 0.027480541887953593
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306443,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5564102564102564,
"acc_stderr": 0.025189149894764205,
"acc_norm": 0.5564102564102564,
"acc_norm_stderr": 0.025189149894764205
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.017437937173343233,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.017437937173343233
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501964,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501964
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333555,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333555
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.024883140570071762,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.024883140570071762
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3106145251396648,
"acc_stderr": 0.015476515438005567,
"acc_norm": 0.3106145251396648,
"acc_norm_stderr": 0.015476515438005567
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.026568921015457138,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.026568921015457138
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.02548311560119545,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.02548311560119545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622866,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622866
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6356209150326797,
"acc_stderr": 0.019469518221573705,
"acc_norm": 0.6356209150326797,
"acc_norm_stderr": 0.019469518221573705
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333047,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333047
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5299877600979193,
"mc1_stderr": 0.017471992091697537,
"mc2": 0.6790668311962296,
"mc2_stderr": 0.015234313921441646
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774094
},
"harness|gsm8k|5": {
"acc": 0.39727065959059893,
"acc_stderr": 0.013478659652337792
}
}
```
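As a quick sanity check, the per-task accuracies above can be macro-averaged in plain Python. This is only a sketch: the leaderboard's own `all` aggregate may weight or select tasks differently.

```python
# A hypothetical subset of the per-task results shown above.
results = {
    "harness|hendrycksTest-security_studies|5":  {"acc": 0.710204081632653},
    "harness|hendrycksTest-sociology|5":         {"acc": 0.7313432835820896},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.81},
}

# Macro-average: every task weighted equally, regardless of sample count.
macro_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(macro_acc, 4))  # 0.7505
```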
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/ninomiya_asuka_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ninomiya_asuka/二宮飛鳥/니노미야아스카 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of ninomiya_asuka/二宮飛鳥/니노미야아스카 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `multicolored_hair, two-tone_hair, long_hair, purple_eyes, orange_hair, bangs, hair_between_eyes, breasts, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 648.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninomiya_asuka_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 389.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninomiya_asuka_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1235 | 837.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninomiya_asuka_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 583.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninomiya_asuka_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1235 | 1.14 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ninomiya_asuka_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ninomiya_asuka_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
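Once extracted, the per-image tags can be filtered locally. The snippet below assumes `item.meta['tags']` is a mapping from tag name to confidence score — verify this against your extracted files, since the exact tag format is an assumption here, not something guaranteed by waifuc:

```python
def select_by_tag(metas, tag, min_score=0.5):
    """Return filenames whose tag mapping contains `tag` at or above a threshold.

    `metas` is a list of meta dicts like those yielded by the loop above.
    """
    return [
        m["filename"]
        for m in metas
        if m["tags"].get(tag, 0.0) >= min_score
    ]

# Toy records standing in for real crawled metadata:
metas = [
    {"filename": "img_1.png", "tags": {"1girl": 0.99, "solo": 0.95}},
    {"filename": "img_2.png", "tags": {"1girl": 0.98}},
]
print(select_by_tag(metas, "solo"))  # ['img_1.png']
```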
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, solo, collarbone, looking_at_viewer, blush, navel, choker, bracelet, red_hair, small_breasts, cleavage, twin_braids, black_bikini, medium_breasts, white_background, simple_background, smile, cowboy_shot, open_mouth, pink_hair |
| 1 | 5 |  |  |  |  |  | 1girl, choker, looking_at_viewer, solo, collarbone, purple_hair, simple_background, upper_body, white_background, long_sleeves, open_mouth, shiny_hair, :d, ahoge, black_shirt, blush, necklace, plaid, sketch |
| 2 | 6 |  |  |  |  |  | 1girl, blue_hair, long_sleeves, shiny_hair, solo, very_long_hair, white_shirt, blue_skirt, frills, looking_at_viewer, miniskirt, underbust, dress_shirt, layered_skirt, black_thighhighs, blush, hair_flower, on_back, simple_background, white_ascot, white_background, zettai_ryouiki |
| 3 | 15 |  |  |  |  |  | 1girl, solo, looking_at_viewer, skirt, thighhighs, smile, beret, braid, detached_sleeves, feathers, necktie, pink_hair |
| 4 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, fingerless_gloves, hood_up, red_hair, midriff, choker, elbow_gloves, navel, red_cape, smile, braid, hooded_cloak, red_cloak, closed_mouth, nail_polish, red_skirt, belt, black_gloves, chain, holding, miniskirt, small_breasts, standing, sword |
| 5 | 13 |  |  |  |  |  | enmaided, wrist_cuffs, 1girl, blush, cat_ears, solo, black_dress, looking_at_viewer, neck_ribbon, white_apron, black_ribbon, blonde_hair, puffy_short_sleeves, simple_background, waist_apron, frilled_apron, fake_animal_ears, white_background, white_thighhighs, closed_mouth, detached_collar, small_breasts, maid_apron, smile, zettai_ryouiki |
| 6 | 15 |  |  |  |  |  | 1girl, blush, nipples, small_breasts, collarbone, open_mouth, 1boy, completely_nude, hetero, simple_background, white_background, looking_at_viewer, navel, solo_focus, sweat, mosaic_censoring |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | collarbone | looking_at_viewer | blush | navel | choker | bracelet | red_hair | small_breasts | cleavage | twin_braids | black_bikini | medium_breasts | white_background | simple_background | smile | cowboy_shot | open_mouth | pink_hair | purple_hair | upper_body | long_sleeves | shiny_hair | :d | ahoge | black_shirt | necklace | plaid | sketch | blue_hair | very_long_hair | white_shirt | blue_skirt | frills | miniskirt | underbust | dress_shirt | layered_skirt | black_thighhighs | hair_flower | on_back | white_ascot | zettai_ryouiki | skirt | thighhighs | beret | braid | detached_sleeves | feathers | necktie | fingerless_gloves | hood_up | midriff | elbow_gloves | red_cape | hooded_cloak | red_cloak | closed_mouth | nail_polish | red_skirt | belt | black_gloves | chain | holding | standing | sword | enmaided | wrist_cuffs | cat_ears | black_dress | neck_ribbon | white_apron | black_ribbon | blonde_hair | puffy_short_sleeves | waist_apron | frilled_apron | fake_animal_ears | white_thighhighs | detached_collar | maid_apron | nipples | 1boy | completely_nude | hetero | solo_focus | sweat | mosaic_censoring |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------|:--------------------|:--------|:--------|:---------|:-----------|:-----------|:----------------|:-----------|:--------------|:---------------|:-----------------|:-------------------|:--------------------|:--------|:--------------|:-------------|:------------|:--------------|:-------------|:---------------|:-------------|:-----|:--------|:--------------|:-----------|:--------|:---------|:------------|:-----------------|:--------------|:-------------|:---------|:------------|:------------|:--------------|:----------------|:-------------------|:--------------|:----------|:--------------|:-----------------|:--------|:-------------|:--------|:--------|:-------------------|:-----------|:----------|:--------------------|:----------|:----------|:---------------|:-----------|:---------------|:------------|:---------------|:--------------|:------------|:-------|:---------------|:--------|:----------|:-----------|:--------|:-----------|:--------------|:-----------|:--------------|:--------------|:--------------|:---------------|:--------------|:----------------------|:--------------|:----------------|:-------------------|:-------------------|:------------------|:-------------|:----------|:-------|:------------------|:---------|:-------------|:--------|:-------------------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | | | | | | | X | X | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | | X | X | | | | | | | | | | X | X | | | | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 15 |  |  |  |  |  | X | X | | X | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | X | | X | | X | X | | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 13 |  |  |  |  |  | X | X | | X | X | | | | | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 6 | 15 |  |  |  |  |  | X | | X | X | X | X | | | | X | | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
nlpso/m0_qualitative_analysis_ocr_cmbert_io | ---
language:
- fr
multilinguality:
- monolingual
task_categories:
- token-classification
---
# m0_qualitative_analysis_ocr_cmbert_io
## Introduction
This dataset was used to perform a **qualitative analysis** of [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner) on the **flat NER task** using the Flat NER approach [M0].
It contains entries from 19th-century Paris trade directories.
## Dataset parameters
* Approach : M0
* Dataset type : noisy (Pero OCR)
* Tokenizer : [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner)
* Tagging format : IO
* Counts :
* Train : 6084
* Dev : 676
* Test : 1685
* Associated fine-tuned model : [nlpso/m0_flat_ner_ocr_cmbert_io](https://huggingface.co/nlpso/m0_flat_ner_ocr_cmbert_io)
## Entity types
Abbreviation|Description
-|-
O |Outside of a named entity
PER |Person or company name
ACT |Person or company professional activity
TITRE |Distinction
LOC |Street name
CARDINAL |Street number
FT |Geographical feature
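In the IO scheme, each token is tagged either `O` or `I-<TYPE>` with no begin marker, so two adjacent entities of the same type cannot be told apart. A toy directory entry might be tagged as follows (illustrative only, not taken from the corpus):

```python
# One made-up 19th-century directory entry, token by token.
tokens = ["Dupont", "boulanger", "rue", "Saint-Denis", "12"]
tags   = ["I-PER",  "I-ACT",     "I-LOC", "I-LOC",     "I-CARDINAL"]

assert len(tokens) == len(tags)
for token, tag in zip(tokens, tags):
    print(f"{token}\t{tag}")
```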
## How to use this dataset
```python
from datasets import load_dataset
train_dev_test = load_dataset("nlpso/m0_qualitative_analysis_ocr_cmbert_io")
```
|
liuyanchen1015/MULTI_VALUE_mrpc_definite_abstract | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 79150
num_examples: 270
- name: train
num_bytes: 167090
num_examples: 559
- name: validation
num_bytes: 18410
num_examples: 62
download_size: 181322
dataset_size: 264650
---
# Dataset Card for "MULTI_VALUE_mrpc_definite_abstract"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_BioMistral__BioMistral-7B-DARE | ---
pretty_name: Evaluation run of BioMistral/BioMistral-7B-DARE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BioMistral/BioMistral-7B-DARE](https://huggingface.co/BioMistral/BioMistral-7B-DARE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BioMistral__BioMistral-7B-DARE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T15:00:58.166641](https://huggingface.co/datasets/open-llm-leaderboard/details_BioMistral__BioMistral-7B-DARE/blob/main/results_2024-02-18T15-00-58.166641.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.569618659780021,\n\
\ \"acc_stderr\": 0.033613488588484064,\n \"acc_norm\": 0.5773660633662913,\n\
\ \"acc_norm_stderr\": 0.03436581187175652,\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.01711581563241819,\n \"mc2\": 0.5560965695589573,\n\
\ \"mc2_stderr\": 0.01537026760670331\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5469283276450512,\n \"acc_stderr\": 0.014546892052005628,\n\
\ \"acc_norm\": 0.5827645051194539,\n \"acc_norm_stderr\": 0.014409825518403079\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6057558255327624,\n\
\ \"acc_stderr\": 0.004876889983110832,\n \"acc_norm\": 0.7987452698665605,\n\
\ \"acc_norm_stderr\": 0.0040011857585710445\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n\
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.03758517775404947,\n\
\ \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.03758517775404947\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n\
\ \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n\
\ \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4808510638297872,\n\
\ \"acc_stderr\": 0.03266204299064678,\n \"acc_norm\": 0.4808510638297872,\n\
\ \"acc_norm_stderr\": 0.03266204299064678\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.04462917535336936,\n\
\ \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.04462917535336936\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"\
acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520193,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520193\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\
\ \"acc_stderr\": 0.026923446059302844,\n \"acc_norm\": 0.6612903225806451,\n\
\ \"acc_norm_stderr\": 0.026923446059302844\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.03095405547036589,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.03095405547036589\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.03027690994517826,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.03027690994517826\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.025069094387296532,\n\
\ \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.025069094387296532\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7394495412844037,\n \"acc_stderr\": 0.01881918203485007,\n \"\
acc_norm\": 0.7394495412844037,\n \"acc_norm_stderr\": 0.01881918203485007\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475524,\n \"\
acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475524\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.024662496845209818,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.024662496845209818\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n\
\ \"acc_stderr\": 0.015411308769686929,\n \"acc_norm\": 0.7535121328224776,\n\
\ \"acc_norm_stderr\": 0.015411308769686929\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.025950054337654075,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.025950054337654075\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249603,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249603\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388852,\n\
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388852\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\
\ \"acc_stderr\": 0.027559949802347817,\n \"acc_norm\": 0.6205787781350482,\n\
\ \"acc_norm_stderr\": 0.027559949802347817\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.02883892147125145,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.02883892147125145\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39308996088657105,\n\
\ \"acc_stderr\": 0.012474899613873956,\n \"acc_norm\": 0.39308996088657105,\n\
\ \"acc_norm_stderr\": 0.012474899613873956\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5866013071895425,\n \"acc_stderr\": 0.019922115682786692,\n \
\ \"acc_norm\": 0.5866013071895425,\n \"acc_norm_stderr\": 0.019922115682786692\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540603,\n\
\ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.01711581563241819,\n \"mc2\": 0.5560965695589573,\n\
\ \"mc2_stderr\": 0.01537026760670331\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.011988541844843909\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15011372251705837,\n \
\ \"acc_stderr\": 0.009838590860906968\n }\n}\n```"
repo_url: https://huggingface.co/BioMistral/BioMistral-7B-DARE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|arc:challenge|25_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|gsm8k|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hellaswag|10_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T15-00-58.166641.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T15-00-58.166641.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- '**/details_harness|winogrande|5_2024-02-18T15-00-58.166641.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T15-00-58.166641.parquet'
- config_name: results
data_files:
- split: 2024_02_18T15_00_58.166641
path:
- results_2024-02-18T15-00-58.166641.parquet
- split: latest
path:
- results_2024-02-18T15-00-58.166641.parquet
---
# Dataset Card for Evaluation run of BioMistral/BioMistral-7B-DARE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BioMistral/BioMistral-7B-DARE](https://huggingface.co/BioMistral/BioMistral-7B-DARE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration; the split is named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BioMistral__BioMistral-7B-DARE",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-18T15:00:58.166641](https://huggingface.co/datasets/open-llm-leaderboard/details_BioMistral__BioMistral-7B-DARE/blob/main/results_2024-02-18T15-00-58.166641.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split of each configuration):
```python
{
"all": {
"acc": 0.569618659780021,
"acc_stderr": 0.033613488588484064,
"acc_norm": 0.5773660633662913,
"acc_norm_stderr": 0.03436581187175652,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.01711581563241819,
"mc2": 0.5560965695589573,
"mc2_stderr": 0.01537026760670331
},
"harness|arc:challenge|25": {
"acc": 0.5469283276450512,
"acc_stderr": 0.014546892052005628,
"acc_norm": 0.5827645051194539,
"acc_norm_stderr": 0.014409825518403079
},
"harness|hellaswag|10": {
"acc": 0.6057558255327624,
"acc_stderr": 0.004876889983110832,
"acc_norm": 0.7987452698665605,
"acc_norm_stderr": 0.0040011857585710445
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520193,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520193
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302844,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302844
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.03095405547036589,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.03095405547036589
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.03027690994517826,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.03027690994517826
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.025069094387296532,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.025069094387296532
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7394495412844037,
"acc_stderr": 0.01881918203485007,
"acc_norm": 0.7394495412844037,
"acc_norm_stderr": 0.01881918203485007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475524,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475524
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.024662496845209818,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.024662496845209818
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7535121328224776,
"acc_stderr": 0.015411308769686929,
"acc_norm": 0.7535121328224776,
"acc_norm_stderr": 0.015411308769686929
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.025950054337654075,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.025950054337654075
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249603,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249603
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388852,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388852
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6205787781350482,
"acc_stderr": 0.027559949802347817,
"acc_norm": 0.6205787781350482,
"acc_norm_stderr": 0.027559949802347817
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.02883892147125145,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.02883892147125145
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39308996088657105,
"acc_stderr": 0.012474899613873956,
"acc_norm": 0.39308996088657105,
"acc_norm_stderr": 0.012474899613873956
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5698529411764706,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.5698529411764706,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5866013071895425,
"acc_stderr": 0.019922115682786692,
"acc_norm": 0.5866013071895425,
"acc_norm_stderr": 0.019922115682786692
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.030116426296540603,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.030116426296540603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772436,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772436
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.01711581563241819,
"mc2": 0.5560965695589573,
"mc2_stderr": 0.01537026760670331
},
"harness|winogrande|5": {
"acc": 0.760852407261247,
"acc_stderr": 0.011988541844843909
},
"harness|gsm8k|5": {
"acc": 0.15011372251705837,
"acc_stderr": 0.009838590860906968
}
}
```
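The per-task entries above all share the same metric names (`acc`, `acc_norm`, and their standard errors), so aggregates can be recomputed directly from the JSON. The sketch below shows one way to macro-average the MMLU (`hendrycksTest`) accuracies; the `results` dict here is a truncated illustration shaped like the JSON above, not the full results, and `mmlu_macro_avg` is a hypothetical helper name.

```python
# Truncated illustration of the results structure shown above;
# only three of the 57 hendrycksTest tasks are included.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4740740740740741},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.625},
}

def mmlu_macro_avg(results: dict) -> float:
    """Macro-average the 'acc' metric over all hendrycksTest tasks."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

print(round(mmlu_macro_avg(results), 4))
```

Run against the full results file, this reproduces the kind of aggregate reported in the "all" block.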
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
seongcho/generadai-sample | ---
dataset_info:
features:
- name: name
dtype: string
- name: description
dtype: string
- name: ad
dtype: string
splits:
- name: train
num_bytes: 2415
num_examples: 5
download_size: 6256
dataset_size: 2415
---
# Dataset Card for "generadai-sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mmajbaig/StateBankPakistanDataset | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 33935
num_examples: 180
download_size: 16873
dataset_size: 33935
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mtc/german_seahorse_dataset_with_articles | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: gem_id
dtype: string
- name: worker_lang
dtype: string
- name: summary
dtype: string
- name: model
dtype: string
- name: question1
dtype: string
- name: question2
dtype: string
- name: question3
dtype: string
- name: question4
dtype: string
- name: question5
dtype: string
- name: question6
dtype: string
- name: article
dtype: string
splits:
- name: test
num_bytes: 9444778
num_examples: 2685
- name: train
num_bytes: 32022408
num_examples: 9180
- name: validation
num_bytes: 4677669
num_examples: 1373
download_size: 14666995
dataset_size: 46144855
---
# Dataset Card for "german_seahorse_dataset_with_articles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Merim/voicesmodels | ---
license: openrail
---
|
nc33/qna_sbert | ---
license: mit
---
|
arbml/CIDAR-MCQ-100 | ---
language:
- ar
license: apache-2.0
size_categories:
- n<1K
task_categories:
- multiple-choice
pretty_name: 'CIDAR-MCQ-100 '
dataset_info:
features:
- name: Question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 18899
num_examples: 100
download_size: 13287
dataset_size: 18899
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "CIDAR-MCQ-100"
# CIDAR-MCQ-100
CIDAR-MCQ-100 contains **100** multiple-choice questions and answers about Arabic culture.
## 📚 Datasets Summary
<table>
<tr>
<th>Name</th>
<th>Explanation</th>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/arbml/cidar>CIDAR</a></t>
<td>10,000 instructions and responses in Arabic</td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/arbml/cidar-eval-100>CIDAR-EVAL-100</a></t>
<td>100 instructions to evaluate LLMs on cultural relevance</td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/arbml/cidar-mcq-100><b>CIDAR-MCQ-100</b></a></t>
<td>100 Multiple choice questions and answers to evaluate LLMs on cultural relevance </td>
</tr>
</table>
<div width="30px" align="center">
| Category | CIDAR-EVAL-100 | <a href=https://huggingface.co/datasets/arbml/cidar-mcq-100><b>CIDAR-MCQ-100</b></a>|
|----------|:-------------:|:------:|
|Food&Drinks | 14 | 8 |
|Names | 14 | 8 |
|Animals | 2 | 4 |
|Language | 10 | 20 |
|Jokes&Puzzles | 3 | 7 |
|Religion | 5 | 10 |
|Business | 6 | 7 |
|Cloths | 4 | 5 |
|Science | 3 | 4 |
|Sports&Games | 4 | 2 |
|Tradition | 4 | 10 |
|Weather | 4 | 2 |
|Geography | 7 | 8 |
|General | 4 | 3 |
|Fonts | 5 | 2 |
|Literature | 10 | 2 |
|Plants | 3 | 0 |
| <i>Total</i> | 100 | 100 |
</div>
## 📋 Dataset Structure
- `Question(str)`: Question about Arabic culture.
- `A(str)`: First choice.
- `B(str)`: Second choice.
- `C(str)`: Third choice.
- `D(str)`: Fourth choice.
- `answer(str)`: The correct choice from A, B, C, and D.
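Given these fields, scoring a model on the benchmark reduces to comparing predicted letters against `answer`. The sketch below is a minimal illustration of that comparison; the two records are placeholder rows mirroring the schema, not actual dataset content.

```python
# Toy records mirroring the documented schema (Question, A-D, answer);
# the "..." values are placeholders, not real dataset rows.
rows = [
    {"Question": "...", "A": "...", "B": "...", "C": "...", "D": "...", "answer": "A"},
    {"Question": "...", "A": "...", "B": "...", "C": "...", "D": "...", "answer": "C"},
]
predictions = ["A", "B"]  # one predicted letter per row

# Accuracy is the fraction of rows where the prediction matches `answer`.
correct = sum(pred == row["answer"] for pred, row in zip(predictions, rows))
accuracy = correct / len(rows)
print(accuracy)  # 0.5 for this toy example
```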
## 📁 Loading The Dataset
You can download the dataset directly from HuggingFace or use the following code:
```python
from datasets import load_dataset
cidar = load_dataset('arbml/CIDAR-MCQ-100')
```
## 📄 Sample From The Dataset:
**Question**: حدد حيوان مشهور في المنطقة
**A**: الجمل
**B**: اللاما
**C**: الكانغرو
**D**: الدب القطبي
**answer**: A
## 🔑 License
The dataset is licensed under **[Apache-2.0](https://www.apache.org/licenses/LICENSE-2.0)**.
## Citation
```
@misc{alyafeai2024cidar,
title={{CIDAR: Culturally Relevant Instruction Dataset For Arabic}},
author={Zaid Alyafeai and Khalid Almubarak and Ahmed Ashraf and Deema Alnuhait and Saied Alshahrani and Gubran A. Q. Abdulrahman and Gamil Ahmed and Qais Gawah and Zead Saleh and Mustafa Ghaleb and Yousef Ali and Maged S. Al-Shaibani},
year={2024},
eprint={2402.03177},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
shidowake/FreedomIntelligence_alpaca-gpt4-japanese_subset_split_9 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 4862244.09533911
num_examples: 4996
download_size: 2555878
dataset_size: 4862244.09533911
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hlillemark/c4_t5_pretrain | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: validation
num_bytes: 53400000
num_examples: 10000
- name: train
num_bytes: 961505597520
num_examples: 180057228
download_size: 2939856140
dataset_size: 961558997520
---
# Dataset Card for "c4_t5_pretrain"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kheopss/template_prompt_hermes | ---
dataset_info:
features:
- name: input
dtype: string
- name: response
dtype: string
- name: text
dtype: string
- name: text2
dtype: string
- name: instruction
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 19498820
num_examples: 1960
download_size: 6851935
dataset_size: 19498820
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kimgahyeon/text | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 88225938
num_examples: 60260
download_size: 15196617
dataset_size: 88225938
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-preference-64-nsample-16_filter_gold_thr_0.1_self_70m | ---
dataset_info:
config_name: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43497682
num_examples: 18928
- name: epoch_1
num_bytes: 44347279
num_examples: 18928
- name: epoch_2
num_bytes: 44401312
num_examples: 18928
- name: epoch_3
num_bytes: 44445720
num_examples: 18928
- name: epoch_4
num_bytes: 44462319
num_examples: 18928
- name: epoch_5
num_bytes: 44472553
num_examples: 18928
- name: epoch_6
num_bytes: 44479624
num_examples: 18928
- name: epoch_7
num_bytes: 44489464
num_examples: 18928
- name: epoch_8
num_bytes: 44488981
num_examples: 18928
- name: epoch_9
num_bytes: 44495780
num_examples: 18928
download_size: 958119263
dataset_size: 443580714
configs:
- config_name: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_1
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_0
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_3
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
---
|
tinhpx2911/vietnamese_book_10k | ---
dataset_info:
features:
- name: text
dtype: string
- name: name
dtype: string
splits:
- name: train
num_bytes: 1607495469
num_examples: 9961
download_size: 844824154
dataset_size: 1607495469
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "10kvnbook"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
loubnabnl/ada-no-pii_checks | ---
dataset_info:
features:
- name: entities
list:
- name: context
dtype: string
- name: end
dtype: int64
- name: score
dtype: float32
- name: start
dtype: int64
- name: tag
dtype: string
- name: value
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_count
dtype: int64
- name: content
dtype: string
- name: id
dtype: string
- name: new_content
dtype: string
- name: modified
dtype: bool
- name: references
dtype: string
splits:
- name: train
num_bytes: 276915088.37363416
num_examples: 10886
download_size: 100410446
dataset_size: 276915088.37363416
---
# Dataset Card for "ada-no-pii_checks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MRAIRR/news_summarization | ---
license: apache-2.0
---
|
Tamazight-NLP/AmaWar | ---
configs:
- config_name: examples
data_files: examples.tsv
sep: "\t"
default: true
- config_name: expressions
data_files: expressions.tsv
sep: "\t"
- config_name: proverbs
data_files: proverbs.tsv
sep: "\t"
- config_name: riddles
data_files: riddles.tsv
sep: "\t"
- config_name: stories
data_files: "stories/*.tsv"
sep: "\t"
- config_name: poems
data_files: "poems/*.tsv"
sep: "\t"
task_categories:
- translation
- text2text-generation
language:
- ber
- tzm
- ar
pretty_name: Amawal Warayni
size_categories:
- 1K<n<10K
---
# Amawal Warayni
Bitext scraped from the online [AmaWar](https://amawalwarayni.com/) dictionary of the Tamazight dialect of Ait Warain spoken in northeastern Morocco.
Contains sentences, stories, and poems in Tamazight along with their translations into Modern Standard Arabic.
Big thanks to Dr. Noureddine Amhaoui for his amazing work.
# Citation
```
نور الدين أمهاوي. (2021). معجم محوسب لمعاني الأسماء والأفعال الأمازيغية الوارينية أمازيغي-عربي.
تاريخ الاسترداد 15 11، 2023، من https://amawalwarayni.com/
```
|
zaixin/1 | ---
license: apache-2.0
---
|
loubnabnl/ada_key_merge_subset | ---
dataset_info:
features:
- name: entities
list:
- name: context
dtype: string
- name: end
dtype: int64
- name: score
dtype: float32
- name: start
dtype: int64
- name: tag
dtype: string
- name: value
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_count
dtype: int64
- name: content
dtype: string
- name: id
dtype: string
- name: new_content
dtype: string
- name: modified
dtype: bool
- name: references
dtype: string
- name: fixed_content
sequence: string
splits:
- name: train
num_bytes: 54890027
num_examples: 580
download_size: 7819078
dataset_size: 54890027
---
# Dataset Card for "ada_key_merge_subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BreadboardLabs/CurioTreeData | ---
license: cc-by-nc-4.0
tags:
- climate
- trees
- images
size_categories:
- 1M<n<10M
---
# The Curio Tree Dataset
This dataset contains much of the tree inventory, images and stories data that was collected on the [Curio platform](https://www.youtube.com/@curio-xyz7991/videos) before it was sunset. The data was extracted from a number of database tables and includes:
- The inventory details of 2.5 million trees from locations across the globe (location, species, diameter at breast height (DBH), height, vitality, etc., where available)
- 27,288 images of trees that were uploaded onto the platform by our community and linked to individual trees and their species information.
- Notes (stories), tags and conversations linked to trees.
### Dataset Description
Curio was an environmental education and outreach platform that was predominantly focused on urban forestry. It connected the various stakeholders involved in the management of urban forestry with the public and, importantly, made all data uploaded via its web and mobile apps publicly available. The platform was live from March 2016 until August 2023, when the maintenance overheads made its ongoing availability infeasible. Curio was supported in its early stages by two European Space Agency projects, through the [New Commons](https://business.esa.int/projects/new-commons) and [Curio Canopy](https://business.esa.int/projects/curio-canopy). A sense of the platform and how it worked can be found via the videos on its supporting [youtube channel](https://www.youtube.com/@curio-xyz7991/videos)
This repository contains much of the tree inventory, images and stories data that was collected on the platform via our community, projects we helped support and open data tree inventories we uploaded onto the platform. We are keen to make this data available for research purposes in the hope it might be of benefit to others and to further the efforts of our community.
We have endeavored to name as many of those great projects and data sources that were hosted on the Curio platform in the attribution section below. If there are any omissions or errors please contact us.
A related project involved generating a high resolution map of tree canopy cover for the Greater London Authority. Details of that project and dataset can be found on the [London Datastore Curio Canopy page](https://data.london.gov.uk/dataset/curio-canopy).
- **Curated by:** Breadboard Labs
- **License:** cc-by-nc-4.0
### Dataset Sources and Attribution
Many people picked up the app and contributed to the data that was collected. Curio was also used to support many great projects and initiatives. We have endeavoured to mention many of those projects below along with the open data tree inventories we uploaded onto the platform.
#### Collaborative projects supported by Curio
- [Morton Arboretum](https://mortonarb.org/) - [Chicago Regional Tree Initiative](https://chicagorti.org/programs/)
- [Dublin City Council’s Parks, Biodiversity and Landscape Services](https://www.dublincity.ie/residential/parks) & [School of Geography at University College Dublin](https://www.ucd.ie/geography) - [Tree Mapping Dublin](https://mappinggreendublin.com/)
- [Sacramento Tree Foundation](https://sactree.org/) - [Save the Elms Program](https://sactree.org/programs/monitoring-elms/)
- [Cambridge City Council](https://www.cambridge.gov.uk/) - [Cambridge City Canopy Programme](https://www.cambridge.gov.uk/cambridge-canopy-project)
- [Municipality of Oslo Agency for Urban Environment](https://www.visitoslo.com/en/product/?tlp=593685) - Inventory and ecosystem services report hosting
- [Friends of Brunswick Park](http://www.friendsofbrunswickpark.co.uk/)
- [Exeter Trees](www.exetertrees.uk)
- [Wembley Park Limited](https://wembleypark.com/)
- [Washington Square Park Eco Projects](https://www.wspecoprojects.org/)
- [Coláiste Bríde Enniscorthy](https://www.colaistebride.ie/)
- [Enniscorthy Vocational College](https://www.enniscorthycc.ie/)
- [Mountshannon Arboretum](https://www.mountshannonarboretum.com/) - Forester Bernard Carey initiated the Mountshannon i-Tree project, in conjunction with UCD and UK-based consultancy Treeconomics.
- [Sidmouth Arboretum](http://sidmoutharboretum.org.uk/)
- [East Devon District Council](https://eastdevon.gov.uk/)
- [SLU](https://www.slu.se/en/) - Alnarp - Skåne Tree Inventory and support for and involvement in the New Commons and Curio Canopy projects
- [Malmö Stad](https://malmo.se/) - Malmö Tree Inventory and support for and involvement in the New Commons and Curio Canopy projects
- [Göteborgs Stad](https://goteborg.se/) -
- [Halmstad](https://www.halmstad.se/)
- [Hvilan](https://www.hvilanutbildning.se/)
- [Familjebostader](https://familjebostader.com/om-oss/)
#### Open Data Sources Attribution
- The Greater London Authority Datastore - [Local Authority Maintained Trees](https://data.london.gov.uk/dataset/local-authority-maintained-trees)
- NYC OpenData - [2015 Street Tree Census - Tree Data](https://data.cityofnewyork.us/Environment/2015-Street-Tree-Census-Tree-Data/uvpi-gqnh)
- Open Data BDN - [Street trees of the city of Barcelona](https://opendata-ajuntament.barcelona.cat/data/dataset/arbrat-viari)
- Open Data Bristol - [Trees](https://opendata.bristol.gov.uk/datasets/7a99218a4bf347ff948f0e5882406a8c)
- Open Data NI - [Belfast City Trees](https://admin.opendatani.gov.uk/dataset/belfast-trees)
- Denver Open data - [Tree Inventory](https://denvergov.org/opendata/dataset/city-and-county-of-denver-tree-inventory)
- Open Data DK - [City of Copenhagen Trees](https://www.opendata.dk/city-of-copenhagen/trae-basis-kommunale-traeer)
- Palo Alto Open Data - [Palo Alto Trees](https://data.cityofpaloalto.org/dataviews/73226/palo-alto-trees/)
- Fingal County Council Open Data - [Fingal County Council Trees](https://data.fingal.ie/maps/1e5f9db62e53443d946c15a1a06fd98b_0/explore)
- Data SA - [City of Adelaide Street Trees](https://data.sa.gov.au/data/dataset/street-trees)
- Open Data Boulder Colorado - [Tree Inventory Open Data](https://open-data.bouldercolorado.gov/datasets/dbbae8bdb0a44d17934243b88e85ef2b)
- Biodiversity Ireland - [Hertitage Trees Ireland](https://maps.biodiversityireland.ie/Dataset/27)
- Birmingham City Council Trees
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
The data is free to be used for research purposes subject to the cc-by-nc-4.0 licence and suitable attribution; please see the citation section below.
Some potential uses might include:
- Investigations into urban tree biodiversity.
- The development of algorithms for extracting tree attributes via photos or streetview imagery.
- A tree species detection app.
- The detection of trees via satellite imagery.
- Species identification via hyperspectral imaging.
It is worth noting that for most use-cases, cleaning, analysis and processing of the data will be necessary. The completeness of tree inventory data varies greatly, and users were not directed in any way in terms of how to frame the photos they took and uploaded via the Curio app.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
### TaggedTrees
Number of data points: 2,593,139
The details of an individual tree, including its location, species, diameter at breast height (DBH), vitality, etc., when available.
### Images
Number of data points: 27,288
The details of images that were uploaded to the platform, including the path to the actual uploaded image, which can be found in the uploads directory. The details of what the image was attached to (usually a "Story" that was then attached to a tree) are also included.
### Uploads:
The set of images referenced in the images data file. The set of images was quite large even when zipped and so was broken up into 10 GB chunks. Download each of the chunks and then run unzip on the uploads.zip file.
A folder containing downsized versions of the images based on a fixed width has also been included - resized-uploads-width1200.zip
### Stories:
The details of a story that was attached to a tree.
### Notes:
The text included in a story/note about a tree.
### Conversations & Comments:
Comments grouped by conversations linked to a particular Story
### TreeSpecies
The tree species dictionary we built to support the platform. Each TaggedTree has a tree_species_id that references an entry in this dictionary when populated.
### TreeSpeciesAliases
The local names across multiple languages that can be used to describe a species of tree contained in the TreeSpecies dictionary.
### Tags and Taggings
Trees could be tagged with details such as diseased, monitored, newly_planted, apples, overhead cables: anything at all that could later be used to filter, group or identify trees of interest, as well as describe their state.
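As a small illustration of how the tables relate, the sketch below resolves each tree's species record via `tree_species_id`, the linking key described above. All row values here are invented for illustration; only the linkage itself reflects the schema.

```python
# Sketch: resolve each TaggedTree's species record via tree_species_id,
# as described above. All field values below are hypothetical; only the
# tree_species_id linkage reflects the actual schema.
tree_species = {
    1: {"scientific_name": "Quercus robur"},
    2: {"scientific_name": "Tilia cordata"},
}
tagged_trees = [
    {"id": 101, "tree_species_id": 1, "dbh_cm": 42.0},
    {"id": 102, "tree_species_id": 2, "dbh_cm": 18.5},
    {"id": 103, "tree_species_id": None, "dbh_cm": None},  # incomplete record
]

def with_species(trees, species):
    """Attach the species record (or None, if unpopulated) to each tree row."""
    return [{**t, "species": species.get(t["tree_species_id"])} for t in trees]

resolved = with_species(tagged_trees, tree_species)
```

The same join applies however the tables are loaded (CSV, database export); species completeness varies, so code should tolerate missing keys as above.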
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
The goal of the Curio platform was to educate, engage and democratise access to environmental information. Making the data collected on the platform available in this form is seen as an extension of that mission.
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
All data was collected via the Curio app by its community. Where inventory data was uploaded in bulk, we preprocessed the data to ensure details such as species information were mapped to the species dictionary we defined, which has been included in this release.
Before making the data available on this platform we decided to run face detection and blur any obvious, detectable faces found in the images that have been included.
<!-- #### Who are the source data producers? -->
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
<!-- #### Personal and Sensitive Information -->
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
<!-- ## Bias, Risks, and Limitations -->
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
```
@misc{CurioTreeData,
  title = {The Curio Tree Dataset},
  author = {Conor Nugent and Paul Hickey},
  year = {2023},
  publisher = {HuggingFace},
  journal = {HuggingFace repository},
  howpublished = {\url{https://huggingface.co/datasets/BreadboardLabs/CurioTreeData}},
}
```
## Dataset Card Authors
Conor Nugent and Paul Hickey
## Dataset Card Contact
[Conor Nugent](https://www.linkedin.com/in/conor-nugent-5b02458/?originalSubdomain=ie) |
jlbaker361/anime_faces_20k | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: src
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 1078772576.0
num_examples: 20000
download_size: 1090696648
dataset_size: 1078772576.0
---
# Dataset Card for "anime_faces_20k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Amirjalaly/khabarfoori | ---
dataset_info:
features:
- name: keywords
dtype: string
- name: source
dtype: string
- name: id
dtype: int64
- name: path
dtype: string
- name: body
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 2734282299
num_examples: 684627
download_size: 1198355935
dataset_size: 2734282299
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EleutherAI/quirky_hemisphere_alice | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 376019.1760309622
num_examples: 3747
- name: validation
num_bytes: 200694.0
num_examples: 2000
- name: test
num_bytes: 200545.5
num_examples: 2000
download_size: 196915
dataset_size: 777258.6760309623
---
# Dataset Card for "quirky_hemisphere_alice"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
levancuter/process_classification | ---
license: gpl-3.0
---
|
JBenjamin25/CVU | ---
license: openrail
---
|
LiveEvil/ImRealSrry | ---
license: bigscience-openrail-m
---
|
liuyanchen1015/MULTI_VALUE_sst2_will_would | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 4064
num_examples: 26
- name: test
num_bytes: 7401
num_examples: 52
- name: train
num_bytes: 113281
num_examples: 918
download_size: 58340
dataset_size: 124746
---
# Dataset Card for "MULTI_VALUE_sst2_will_would"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-launch__gov_report-plain_text-1abd3a-16146233 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- launch/gov_report
eval_info:
task: summarization
model: google/bigbird-pegasus-large-pubmed
metrics: ['bertscore']
dataset_name: launch/gov_report
dataset_config: plain_text
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/bigbird-pegasus-large-pubmed
* Dataset: launch/gov_report
* Config: plain_text
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
NeuroBench/mswc_fscil_subset | ---
license: cc-by-4.0
---
This is a subset of the [Multilingual Spoken Word Corpus](https://huggingface.co/datasets/MLCommons/ml_spoken_words) dataset, which is built specifically for the Few-shot Class-incremental Learning ([FSCIL](https://github.com/xyutao/fscil)) task.
A total of 15 languages are chosen, split into 5 base languages (English, German, Catalan, French, Kinyarwanda) and 10 incrementally learned languages (Persian, Spanish, Russian, Welsh, Italian, Basque, Polish, Esperanto, Portuguese, Dutch).
The FSCIL task entails first training a model using abundant training data on words from the 5 base languages, then in subsequent incremental sessions the model must learn new words from an incremental language with few training examples for each, while retaining knowledge of all prior learned words.
Each of the 5 base languages consists of 20 classes, with 500/100/100 samples for train/val/test splits each.
Each of the 10 incremental languages consists of 10 classes, each with 200 available samples. From these, a small number (e.g., 5) will be chosen for few-shot training, and 100 other samples are chosen for testing.
Thus, the model first has a knowledge base of 100 words from the base classes, which expands to 200 words by the end of all incremental sessions.
By default, the NeuroBench harness will install the 48 kHz opus-formatted data. Audio files converted to 16 kHz WAV are also available to be downloaded from this repository.
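The class schedule implied by the description above can be sketched as follows: a base session of 5 languages with 20 classes each, then 10 incremental sessions that each add 10 classes from one new language.

```python
# Sketch of the FSCIL class schedule described above: session 0 covers
# the 5 base languages (20 classes each), and each of the 10 incremental
# sessions adds 10 classes from one new language.
BASE_LANGUAGES, CLASSES_PER_BASE = 5, 20
INCREMENTAL_SESSIONS, CLASSES_PER_INCREMENT = 10, 10

known_classes = [BASE_LANGUAGES * CLASSES_PER_BASE]  # session 0: 100 classes
for _ in range(INCREMENTAL_SESSIONS):
    known_classes.append(known_classes[-1] + CLASSES_PER_INCREMENT)
# known_classes grows from 100 to 200 over the 11 sessions
```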
|
abid/indonesia-bioner-dataset | ---
license: bsd-3-clause-clear
---
### Indonesia BioNER Dataset
This dataset was taken from the online health consultation platform Alodokter.com and has been annotated by two medical doctors. Data were annotated using IOB tags in CoNLL format.
The dataset contains 2,600 medical answers written by doctors from 2017-2020. Two medical experts were assigned to annotate the data into two entity types: DISORDERS and ANATOMY. The topics of the answers are diarrhea, HIV-AIDS, nephrolithiasis and TBC, which are marked as high-risk by the WHO.
This work was made possible by generous support from Dr. Diana Purwitasari and Safitri Juanita.
> Note: this data is provided as is in Bahasa Indonesia. No translations are provided.
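A minimal sketch of reading the IOB-tagged CoNLL files into sentences of (token, tag) pairs. The two-column, tab-separated, blank-line-delimited layout is an assumption about the exact file format, and the sample tokens are invented; verify against the actual `.conll` files before use.

```python
# Sketch: parse IOB-annotated CoNLL text into sentences of (token, tag)
# pairs. Assumes one "token<TAB>tag" pair per line with blank lines
# between sentences -- check the actual files, as column separators vary.
def read_conll(text):
    sentences, current = [], []
    for line in text.splitlines():
        line = line.strip()
        if not line:  # a blank line ends the current sentence
            if current:
                sentences.append(current)
                current = []
            continue
        token, tag = line.split("\t")
        current.append((token, tag))
    if current:  # flush a trailing sentence with no final blank line
        sentences.append(current)
    return sentences

# Hypothetical two-sentence snippet using the card's DISORDERS/ANATOMY tags
sample = "sakit\tB-DISORDERS\nperut\tI-DISORDERS\n\ndi\tO\nlambung\tB-ANATOMY\n"
parsed = read_conll(sample)
```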
| File | Amount |
|-------------|--------|
| train.conll | 1950 |
| valid.conll | 260 |
| test.conll | 390 | |
open-llm-leaderboard/details_ChavyvAkvar__habib-v3 | ---
pretty_name: Evaluation run of ChavyvAkvar/habib-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ChavyvAkvar/habib-v3](https://huggingface.co/ChavyvAkvar/habib-v3) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChavyvAkvar__habib-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-05T21:08:48.723483](https://huggingface.co/datasets/open-llm-leaderboard/details_ChavyvAkvar__habib-v3/blob/main/results_2024-04-05T21-08-48.723483.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6350330280891581,\n\
\ \"acc_stderr\": 0.0323878975631387,\n \"acc_norm\": 0.6368677664217826,\n\
\ \"acc_norm_stderr\": 0.033040441319805824,\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5350945498925448,\n\
\ \"mc2_stderr\": 0.01502938124934662\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.606655290102389,\n \"acc_stderr\": 0.014275101465693026,\n\
\ \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.0140702655192688\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6313483369846644,\n\
\ \"acc_stderr\": 0.004814532642574655,\n \"acc_norm\": 0.8302131049591714,\n\
\ \"acc_norm_stderr\": 0.0037467817125096518\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959217,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959217\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635484,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635484\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.039439666991836285,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.039439666991836285\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976044,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976044\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.024161618127987745,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.024161618127987745\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n\
\ \"acc_stderr\": 0.014283378044296422,\n \"acc_norm\": 0.8007662835249042,\n\
\ \"acc_norm_stderr\": 0.014283378044296422\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31843575418994413,\n\
\ \"acc_stderr\": 0.015581008080360276,\n \"acc_norm\": 0.31843575418994413,\n\
\ \"acc_norm_stderr\": 0.015581008080360276\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890165,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890165\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n\
\ \"acc_stderr\": 0.0127199495430322,\n \"acc_norm\": 0.4556714471968709,\n\
\ \"acc_norm_stderr\": 0.0127199495430322\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.619281045751634,\n \"acc_stderr\": 0.019643801557924806,\n \
\ \"acc_norm\": 0.619281045751634,\n \"acc_norm_stderr\": 0.019643801557924806\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5350945498925448,\n\
\ \"mc2_stderr\": 0.01502938124934662\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090255\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6027293404094011,\n \
\ \"acc_stderr\": 0.0134786596523378\n }\n}\n```"
repo_url: https://huggingface.co/ChavyvAkvar/habib-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|arc:challenge|25_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|gsm8k|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hellaswag|10_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-08-48.723483.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T21-08-48.723483.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- '**/details_harness|winogrande|5_2024-04-05T21-08-48.723483.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-05T21-08-48.723483.parquet'
- config_name: results
data_files:
- split: 2024_04_05T21_08_48.723483
path:
- results_2024-04-05T21-08-48.723483.parquet
- split: latest
path:
- results_2024-04-05T21-08-48.723483.parquet
---
# Dataset Card for Evaluation run of ChavyvAkvar/habib-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ChavyvAkvar/habib-v3](https://huggingface.co/ChavyvAkvar/habib-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ChavyvAkvar__habib-v3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-05T21:08:48.723483](https://huggingface.co/datasets/open-llm-leaderboard/details_ChavyvAkvar__habib-v3/blob/main/results_2024-04-05T21-08-48.723483.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6350330280891581,
"acc_stderr": 0.0323878975631387,
"acc_norm": 0.6368677664217826,
"acc_norm_stderr": 0.033040441319805824,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5350945498925448,
"mc2_stderr": 0.01502938124934662
},
"harness|arc:challenge|25": {
"acc": 0.606655290102389,
"acc_stderr": 0.014275101465693026,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.0140702655192688
},
"harness|hellaswag|10": {
"acc": 0.6313483369846644,
"acc_stderr": 0.004814532642574655,
"acc_norm": 0.8302131049591714,
"acc_norm_stderr": 0.0037467817125096518
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544064,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544064
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635484,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635484
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228402,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228402
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.039439666991836285,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.039439666991836285
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976044,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976044
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.024161618127987745,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.024161618127987745
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296422,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296422
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31843575418994413,
"acc_stderr": 0.015581008080360276,
"acc_norm": 0.31843575418994413,
"acc_norm_stderr": 0.015581008080360276
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890165,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890165
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.0127199495430322,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.0127199495430322
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.619281045751634,
"acc_stderr": 0.019643801557924806,
"acc_norm": 0.619281045751634,
"acc_norm_stderr": 0.019643801557924806
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5350945498925448,
"mc2_stderr": 0.01502938124934662
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090255
},
"harness|gsm8k|5": {
"acc": 0.6027293404094011,
"acc_stderr": 0.0134786596523378
}
}
```
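For a quick look at where the model is strongest and weakest, the per-task entries above can be ranked once the JSON is parsed. This is an illustrative sketch only (not part of the evaluation pipeline), using a handful of accuracy values copied from the results listed in this card:

```python
# Rank a few of the per-task accuracies reported above (values copied
# from the "Latest results" JSON in this card).
results = {
    "harness|hendrycksTest-us_foreign_policy|5": 0.88,
    "harness|hendrycksTest-sociology|5": 0.8507462686567164,
    "harness|gsm8k|5": 0.6027293404094011,
    "harness|hendrycksTest-moral_scenarios|5": 0.31843575418994413,
    "harness|hendrycksTest-high_school_mathematics|5": 0.3,
}

# Sort tasks from strongest to weakest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked:
    print(f"{acc:.3f}  {task}")
```

The same pattern works on the full `results_*.json` file after `json.load`, since each task's metrics are stored under the task name as a key.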
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Sharathhebbar24/databricks-dolly-15k | ---
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 12664945
num_examples: 15011
download_size: 7368629
dataset_size: 12664945
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- text-generation
language:
- en
pretty_name: dolly
size_categories:
- 10K<n<100K
---
# Databricks-dolly
This is a cleansed version of [databricks/databricks-dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k)
## Usage
```python
from datasets import load_dataset
dataset = load_dataset("Sharathhebbar24/databricks-dolly-15k", split="train")
``` |
silk-road/ragged_CharacterEval | ---
license: cc-by-sa-4.0
---
|
jganzabalseenka/code-text-for-lm-scratch | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 8618263476
num_examples: 16702061
- name: valid
num_bytes: 48072624
num_examples: 93164
download_size: 650619551
dataset_size: 8666336100
---
# Dataset Card for "code-text-for-lm-scratch"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
strombergnlp/bajer_danish_misogyny_preview | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- da
license: other
multilinguality:
- monolingual
pretty_name: 'BAJER: Annotations for Misogyny'
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- hate-speech-detection
paperswithcode_id: bajer-danish-misogyny
tags:
- not-for-all-audiences
extra_gated_prompt: "Warning: this repository contains harmful content (abusive language, hate speech, stereotypes)."
---
# Dataset Card for "Bajer"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://stromberg.ai/publication/aom/](https://stromberg.ai/publication/aom/)
- **Repository:** [https://github.com/StrombergNLP/Online-Misogyny-in-Danish-Bajer](https://github.com/StrombergNLP/Online-Misogyny-in-Danish-Bajer)
- **Paper:** [https://aclanthology.org/2021.acl-long.247/](https://aclanthology.org/2021.acl-long.247/)
- **Point of Contact:** [Leon Derczynski](https://github.com/leondz)
- **Size of downloaded dataset files:** 7.29 MiB
- **Size of the generated dataset:** 6.57 MiB
- **Total amount of disk used:** 13.85 MiB
### THIS PUBLIC-FACING DATASET IS A PREVIEW ONLY
This is a working data reader but the data here is just a preview of the full dataset, for safety & legal reasons.
To apply to access the entire dataset, complete this [form](https://forms.gle/MPdV8FG8EUuS1MdS6).
When you have the full data, amend `_URL` in `bajer.py` to point to the full data TSV's filename.
### Dataset Summary
This is a high-quality dataset of annotated posts sampled from social
media posts and annotated for misogyny. Danish language.
<iframe width="560" height="315" src="https://www.youtube.com/embed/xayfVkt7gwo" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
See the accompanying ACL paper [Annotating Online Misogyny](https://aclanthology.org/2021.acl-long.247/) for full details.
### Supported Tasks and Leaderboards
* [Hate Speech Detection on bajer_danish_misogyny](https://paperswithcode.com/sota/hate-speech-detection-on-bajer-danish)
### Languages
Danish (`bcp47:da`)
## Dataset Structure
### Data Instances
#### Bajer
In this preview: 10 instances
In the full dataset:
- **Size of downloaded dataset files:** 7.29 MiB
- **Size of the generated dataset:** 6.57 MiB
- **Total amount of disk used:** 13.85 MiB
See above (or below) for how to get the full dataset.
An example of 'train' looks as follows.
```
{
'id': '0',
'dataset_id': '0',
'label_id': '0',
'text': 'Tilfældigt hva, din XXXXXXXXXX 🤬🤬🤬',
'sampling': 'keyword_twitter',
'subtask_A': 1,
'subtask_B': 0,
'subtask_C1': 3,
'subtask_C2': 6
}
```
### Data Fields
- `id`: a `string` feature, unique identifier in this dataset.
- `dataset_id`: a `string` feature, internal annotation identifier.
- `label_id`: a `string` feature, internal annotation sequence number.
- `text`: a `string` of the text that's annotated.
- `sampling`: a `string` describing which sampling technique surfaced this message
- `subtask_A`: is the text abusive `ABUS` or not `NOT`? `0: NOT, 1: ABUS`
- `subtask_B`: for abusive text, what's the target - individual `IND`, group `GRP`, other `OTH`, or untargeted `UNT`? `0: IND, 1: GRP, 2: OTH, 3: UNT, 4: not applicable`
- `subtask_C1`: for group-targeted abuse, what's the group - misogynistic `SEX`, other `OTH`, or racist `RAC`? `0: SEX, 1: OTH, 2: RAC, 3: not applicable`
- `subtask_C2`: for misogyny, is it neosexist `NEOSEX`, discrediting `DISCREDIT`, normative stereotyping `NOR`, benevolent sexism `AMBIVALENT`, dominance `DOMINANCE`, or harassment `HARASSMENT`? `0: NEOSEX, 1: DISCREDIT, 2: NOR, 3: AMBIVALENT, 4: DOMINANCE, 5: HARASSMENT, 6: not applicable`
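The integer codes above can be mapped back to label names with a small helper. This is an illustrative sketch (not part of the dataset loader), using only the mappings documented in this card and the example instance shown earlier:

```python
# Decode the integer label codes documented above back to their string
# names. The mappings are copied from this card's field descriptions.
LABEL_MAPS = {
    "subtask_A": ["NOT", "ABUS"],
    "subtask_B": ["IND", "GRP", "OTH", "UNT", "not applicable"],
    "subtask_C1": ["SEX", "OTH", "RAC", "not applicable"],
    "subtask_C2": ["NEOSEX", "DISCREDIT", "NOR", "AMBIVALENT",
                   "DOMINANCE", "HARASSMENT", "not applicable"],
}

def decode(example: dict) -> dict:
    """Replace integer codes with their label names where a map exists."""
    return {k: (LABEL_MAPS[k][v] if k in LABEL_MAPS else v)
            for k, v in example.items()}

# The label fields of the example instance from "Data Instances" above.
example = {"subtask_A": 1, "subtask_B": 0, "subtask_C1": 3, "subtask_C2": 6}
decoded = decode(example)
print(decoded)
```

Note how the hierarchy shows up in the example: the post is abusive and individual-targeted, so the group-level subtasks C1 and C2 decode to "not applicable".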
### Data Splits
In the full dataset:
| name |train|
|---------|----:|
|bajer|27880 sentences|
This preview has only 10 sentences - the link for access to the full data is given at the top of this page.
## Dataset Creation
### Curation Rationale
The goal was to collect data for developing an annotation schema of online misogyny.
Random sampling of text often results in scarcity of examples of specifically misogynistic content (e.g. (Wulczyn et al., 2017; Founta et al., 2018)). Therefore, we used the common alternative of collecting data by using predefined keywords with a potentially high search hit (e.g. Waseem and Hovy (2016)), and identifying relevant user-profiles (e.g. (Anzovino et al., 2018)) and related topics (e.g. (Kumar et al., 2018)).

We searched for keywords (specific slurs, hashtags) that are known to occur in sexist posts. These were defined by previous work, a slur list from Reddit, and from interviews and surveys of online misogyny among women. We also searched for broader terms like “sex” or “women”, which do not appear exclusively in a misogynistic context, for example in the topic search, where we gathered relevant posts and their comments from the social media pages of public media. A complete list of keywords can be found in the appendix.

Social media provides a potentially biased, but broad snapshot of online human discourse, with plenty of language and behaviours represented. Following best practice guidelines (Vidgen and Derczynski, 2020), we sampled from a language for which there are no existing annotations of the target phenomenon: Danish.

Different social media platforms attract different user groups and can exhibit domain-specific language (Karan and Šnajder, 2018). Rather than choosing one platform (existing misogyny datasets are primarily based on Twitter and Reddit (Guest et al., 2021)), we sampled from multiple platforms: Statista (2020) shows that the platform where most Danish users are present is Facebook, followed by Twitter, YouTube, Instagram and lastly, Reddit. The dataset was sampled from Twitter, Facebook and Reddit posts as plain text.
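As an illustration of the keyword-based (purposive) sampling described above, a minimal filter might look like the following. The keywords here are placeholders taken from the broader terms mentioned in this card; the actual list is given in the paper's appendix:

```python
# Illustrative purposive sampling: keep only posts containing at least one
# search keyword. The keyword set below is a placeholder, not the real
# list (which appears in the paper's appendix).
KEYWORDS = {"sex", "women"}  # broader terms mentioned in this card

def matches_keywords(post: str, keywords=KEYWORDS) -> bool:
    """True if any whitespace-separated token of the post is a keyword."""
    tokens = set(post.lower().split())
    return bool(tokens & keywords)

posts = [
    "debate about women in politics",
    "football results from last night",
]
sampled = [p for p in posts if matches_keywords(p)]
print(sampled)
```

A real pipeline would also handle hashtags, punctuation, and inflected forms, which a plain token match misses.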
### Source Data
#### Initial Data Collection and Normalization
The dataset was sampled from Twitter, Facebook and Reddit posts as plain text. Data was gathered based on: keyword-based search (i.e. purposive sampling); topic-based search; and content from specific users.
#### Who are the source language producers?
Danish-speaking social media users
### Annotations
#### Annotation process
In annotating our dataset, we built on the MATTER
framework (Pustejovsky and Stubbs, 2012) and use
the variation presented by Finlayson and Erjavec
(2017) (the MALER framework), where the Train & Test stages are replaced by Leveraging of annotations for one’s particular goal, in our case the
creation of a comprehensive taxonomy.
We created a set of guidelines for the annotators.
The annotators were first asked to read the guidelines and individually annotate about 150 different
posts, after which there was a shared discussion.
After this pilot round, the volume of samples per annotator was increased and every sample labeled by
2-3 annotators. When instances were ‘flagged’ or
annotators disagreed on them, they were discussed
during weekly meetings, and misunderstandings
were resolved together with the external facilitator. After round three, when reaching 7k annotated
posts (Figure 2), we continued with independent
annotations maintaining a 15% instance overlap
between randomly picked annotator pairs.
Management of annotator disagreement is an important part of the process design. Disagreements can be resolved by majority voting (Davidson et al., 2017; Wiegand et al., 2019), by labeling content as abuse if at least one annotator has labeled it so (Golbeck et al., 2017), or by a third objective instance (Gao and Huang, 2017). Most datasets use crowdsourcing platforms or a few academic experts for annotation (Vidgen and Derczynski, 2020). Inter-annotator agreement (IAA) and classification performance are established as two grounded evaluation measurements for annotation quality (Vidgen and Derczynski, 2020). Comparing the performance of amateur annotators (provided with guidelines) with expert annotators for sexism and racism annotation, Waseem (2016) shows that the quality of amateur annotators is competitive with expert annotations when several amateurs agree. Facing the trade-off between training annotators intensively and the number of annotators involved, we continued with the trained annotators and used group discussions and individual revisions for flagged content and disagreements (Section 5.4).
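IAA on the 15% instance overlap between annotator pairs can be quantified with a chance-corrected measure such as Cohen's kappa. A minimal sketch (the labels and pairing below are illustrative toy values, not drawn from the dataset):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two annotators' label lists."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of items labeled identically.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected chance agreement from each annotator's label distribution.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Toy overlap: abusive/not-abusive labels from two annotators on shared posts.
a = ["ABUS", "NOT", "NOT", "ABUS", "NOT", "NOT"]
b = ["ABUS", "NOT", "ABUS", "ABUS", "NOT", "NOT"]
print(round(cohens_kappa(a, b), 3))  # → 0.667
```

For the multi-level labels in this taxonomy, a weighted or multi-rater variant (e.g. Fleiss' kappa) would be the natural extension.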
#### Who are the annotators?
Demographic category|Value
---|---
Gender|6 female, 2 male (8 total)
Age|5 <30; 3 ≥30
Ethnicity|5 Danish; 1 Persian, 1 Arabic, 1 Polish
Study/occupation|Linguistics (2); Health/Software Design; Ethnography/Digital Design; Communication/Psychology; Anthropology/Broadcast Moderator; Ethnography/Climate Change; Film Artist
### Personal and Sensitive Information
Usernames and PII were stripped during annotation process by: skipping content containing these; and eliding it from the final dataset.
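For illustration only, elision of common PII surface forms can be sketched with simple pattern substitution. Note that the actual anonymization in this dataset was performed during the manual annotation process, not by these hypothetical regexes:

```python
import re

# Illustrative patterns; real anonymization pipelines need broader coverage.
URL = re.compile(r"https?://\S+")
EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")
USERNAME = re.compile(r"@\w+")

def elide_pii(text: str) -> str:
    """Replace common PII surface forms with placeholder tokens."""
    text = URL.sub("[URL]", text)        # substitute URLs first, before @-handles
    text = EMAIL.sub("[EMAIL]", text)    # emails before bare @-handles
    text = USERNAME.sub("[USER]", text)
    return text

print(elide_pii("@nogen skrev til mig@example.com via https://example.dk/post"))
# → [USER] skrev til [EMAIL] via [URL]
```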
## Considerations for Using the Data
### Social Impact of Dataset
The data contains abusive language. It may be possible to identify original speakers based on the content, so the data is only available for research purposes under a restrictive license and conditions. We hope that identifying sexism can help moderators. There is a possibility that the content here could be used to generate misogyny in Danish, which would place women in Denmark in an even more hostile environment, and for this reason data access is restricted and tracked.
### Discussion of Biases
We have taken pains to mitigate as many biases as we were aware of in this work.
**Selection biases:** Selection biases for abusive language can be seen in the sampling of text, for instance when using keyword search (Wiegand et al., 2019), topic dependency (Ousidhoum et al., 2020), users (Wiegand et al., 2019), domain (Wiegand et al., 2019), time (Florio et al., 2020), and lack of linguistic variety (Vidgen and Derczynski, 2020).
**Label biases:** Label biases can be caused by, for instance, non-representative annotator selection, lack of training/domain expertise, preconceived notions, or pre-held stereotypes. These biases are treated in relation to abusive language datasets by several sources, e.g. general sampling and annotator biases (Waseem, 2016; Al Kuwatly et al., 2020), biases towards minority identity mentions based for example on gender or race (Davidson et al., 2017; Dixon et al., 2018; Park et al., 2018; Davidson et al., 2019), and political annotator biases (Wich et al., 2020). Other qualitative biases comprise, for instance, demographic bias, over-generalization, and topic exposure as social biases (Hovy and Spruit, 2016).
We applied several measures to mitigate biases occurring through the annotation design and execution. First, we selected labels grounded in existing, peer-reviewed research from more than one field. Second, we aimed for diversity in annotator profiles in terms of age, gender, dialect, and background. Third, we recruited a facilitator with a background in ethnographic studies and provided intensive annotator training. Fourth, we engaged in weekly group discussions, iteratively improving the codebook and integrating edge cases. Fifth, the selection of platforms from which we sampled data was based on local user representation in Denmark, rather than convenience. Sixth, diverse sampling methods for data collection reduced selection biases.
### Other Known Limitations
The data is absolutely NOT a representative or in any way stratified sample of social media text, so class prevalence/balance here says nothing about the incidence of these phenomena in the wild. That said, we hypothesize that the distribution of types of misogyny in this data (subtask C2) is roughly representative of how misogyny presents on the studied platforms.
## Additional Information
### Dataset Curators
The dataset is curated by the paper's authors and the ethnographer-led annotation team.
### Licensing Information
The data is licensed under a restrictive usage agreement. [Apply for access here](https://forms.gle/MPdV8FG8EUuS1MdS6)
### Citation Information
```
@inproceedings{zeinert-etal-2021-annotating,
title = "Annotating Online Misogyny",
author = "Zeinert, Philine and
Inie, Nanna and
Derczynski, Leon",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.247",
doi = "10.18653/v1/2021.acl-long.247",
pages = "3181--3197",
}
```
### Contributions
Author-added dataset [@leondz](https://github.com/leondz)
|
AdapterOcean/med_alpaca_standardized_cluster_86_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 8815581
num_examples: 15798
download_size: 4485749
dataset_size: 8815581
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_86_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-kand2-sdxl-wuerst-karlo/d8b81ca5 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 163
num_examples: 10
download_size: 1299
dataset_size: 163
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "d8b81ca5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gonglinyuan/safim | ---
license: cc-by-4.0
task_categories:
- text2text-generation
language:
- en
tags:
- code-generation
- code-infilling
- fill-in-the-middle
pretty_name: SAFIM
size_categories:
- 10K<n<100K
configs:
- config_name: block
data_files:
- split: test
path: block_completion.jsonl.gz
- config_name: control
data_files:
- split: test
path: control_completion.jsonl.gz
- config_name: api
data_files:
- split: test
path: api_completion.jsonl.gz
- config_name: block_v2
data_files:
- split: test
path: block_completion_v2.jsonl.gz
---
# SAFIM Benchmark
Syntax-Aware Fill-in-the-Middle (SAFIM) is a benchmark for evaluating Large Language Models (LLMs) on
the code Fill-in-the-Middle (FIM) task. SAFIM has three subtasks: Algorithmic Block Completion,
Control-Flow Expression Completion, and API Function Call Completion. SAFIM is sourced from code
submitted from April 2022 to January 2023 to minimize the impact of data contamination on evaluation
results.
- Authors: [Linyuan Gong](https://gonglinyuan.com), Sida Wang, Mostafa Elhoushi, Alvin Cheung
- Paper: [https://arxiv.org/abs/2403.04814](https://arxiv.org/abs/2403.04814)
- Leaderboard: [https://safimbenchmark.com](https://safimbenchmark.com)
- Code & Submission Instructions: [https://github.com/gonglinyuan/safim](https://github.com/gonglinyuan/safim)
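Each subtask corresponds to a config of this dataset (see the YAML header above), each exposing a single `test` split. A small helper for loading one subtask with the `datasets` library might look like:

```python
# Subtask configs declared in this dataset's YAML header.
SAFIM_CONFIGS = ("block", "control", "api")

def load_subtask(config: str):
    """Load one SAFIM subtask; each config has a single 'test' split."""
    # Imported lazily so the helper can be defined without the dependency.
    from datasets import load_dataset

    assert config in SAFIM_CONFIGS, f"unknown SAFIM config: {config}"
    return load_dataset("gonglinyuan/safim", config, split="test")

# Example: blocks = load_subtask("block")
```

A `block_v2` config is also declared in the header for the revised block-completion split.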
## Copyright Information
The SAFIM benchmark is partially derived from problem descriptions and code solutions from
[https://codeforces.com](https://codeforces.com). According to the license of CodeForces, you may publish the texts of
Codeforces problems in any open sources, but you must preserve a direct link to the site.
## Citation
```
@article{
safim,
title={Evaluation of {LLM}s on Syntax-Aware Code Fill-in-the-Middle Tasks},
url={http://arxiv.org/abs/2403.04814},
note={arXiv:2403.04814 [cs]},
number={arXiv:2403.04814},
publisher={arXiv},
author={Gong, Linyuan and Wang, Sida and Elhoushi, Mostafa and Cheung, Alvin},
year={2024},
month=mar
}
``` |
open-llm-leaderboard/details_yleo__ParrotOgno-7B | ---
pretty_name: Evaluation run of yleo/ParrotOgno-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yleo/ParrotOgno-7B](https://huggingface.co/yleo/ParrotOgno-7B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yleo__ParrotOgno-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-15T16:28:55.072793](https://huggingface.co/datasets/open-llm-leaderboard/details_yleo__ParrotOgno-7B/blob/main/results_2024-02-15T16-28-55.072793.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.651472054199089,\n\
\ \"acc_stderr\": 0.0320071819287666,\n \"acc_norm\": 0.6506761514645453,\n\
\ \"acc_norm_stderr\": 0.03267799309361849,\n \"mc1\": 0.6181150550795593,\n\
\ \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.7652952718521188,\n\
\ \"mc2_stderr\": 0.013990406463043562\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7141638225255973,\n \"acc_stderr\": 0.013203196088537372,\n\
\ \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869148\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.714299940250946,\n\
\ \"acc_stderr\": 0.004508239594503832,\n \"acc_norm\": 0.8902609042023502,\n\
\ \"acc_norm_stderr\": 0.0031192548288489453\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590167,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590167\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n\
\ \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903347,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903347\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n\
\ \"acc_stderr\": 0.01661988198817702,\n \"acc_norm\": 0.4446927374301676,\n\
\ \"acc_norm_stderr\": 0.01661988198817702\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6181150550795593,\n\
\ \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.7652952718521188,\n\
\ \"mc2_stderr\": 0.013990406463043562\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.846093133385951,\n \"acc_stderr\": 0.010141944523750035\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6959818043972706,\n \
\ \"acc_stderr\": 0.012670420440198664\n }\n}\n```"
repo_url: https://huggingface.co/yleo/ParrotOgno-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|arc:challenge|25_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|gsm8k|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hellaswag|10_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T16-28-55.072793.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T16-28-55.072793.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- '**/details_harness|winogrande|5_2024-02-15T16-28-55.072793.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-15T16-28-55.072793.parquet'
- config_name: results
data_files:
- split: 2024_02_15T16_28_55.072793
path:
- results_2024-02-15T16-28-55.072793.parquet
- split: latest
path:
- results_2024-02-15T16-28-55.072793.parquet
---
# Dataset Card for Evaluation run of yleo/ParrotOgno-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yleo/ParrotOgno-7B](https://huggingface.co/yleo/ParrotOgno-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yleo__ParrotOgno-7B",
"harness_winogrande_5",
	split="latest")
```
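The per-task config names listed in the YAML header above follow a simple naming convention: the harness task identifier (e.g. `harness|truthfulqa:mc|0`) with pipes, hyphens, and colons replaced by underscores. A small sketch of that mapping (the helper name `task_to_config_name` is illustrative, not part of any library):

```python
def task_to_config_name(task: str) -> str:
    """Map a harness task id like 'harness|hendrycksTest-virology|5'
    to the dataset config name 'harness_hendrycksTest_virology_5'."""
    return task.replace("|", "_").replace("-", "_").replace(":", "_")

# Examples drawn from this card's config list:
print(task_to_config_name("harness|truthfulqa:mc|0"))      # harness_truthfulqa_mc_0
print(task_to_config_name("harness|winogrande|5"))         # harness_winogrande_5
```

This lets you build the second argument to `load_dataset` programmatically for any task you see in the results JSON below.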
## Latest results
These are the [latest results from run 2024-02-15T16:28:55.072793](https://huggingface.co/datasets/open-llm-leaderboard/details_yleo__ParrotOgno-7B/blob/main/results_2024-02-15T16-28-55.072793.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.651472054199089,
"acc_stderr": 0.0320071819287666,
"acc_norm": 0.6506761514645453,
"acc_norm_stderr": 0.03267799309361849,
"mc1": 0.6181150550795593,
"mc1_stderr": 0.017008101939163498,
"mc2": 0.7652952718521188,
"mc2_stderr": 0.013990406463043562
},
"harness|arc:challenge|25": {
"acc": 0.7141638225255973,
"acc_stderr": 0.013203196088537372,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869148
},
"harness|hellaswag|10": {
"acc": 0.714299940250946,
"acc_stderr": 0.004508239594503832,
"acc_norm": 0.8902609042023502,
"acc_norm_stderr": 0.0031192548288489453
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055273,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055273
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590167,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903347,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903347
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4446927374301676,
"acc_stderr": 0.01661988198817702,
"acc_norm": 0.4446927374301676,
"acc_norm_stderr": 0.01661988198817702
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6181150550795593,
"mc1_stderr": 0.017008101939163498,
"mc2": 0.7652952718521188,
"mc2_stderr": 0.013990406463043562
},
"harness|winogrande|5": {
"acc": 0.846093133385951,
"acc_stderr": 0.010141944523750035
},
"harness|gsm8k|5": {
"acc": 0.6959818043972706,
"acc_stderr": 0.012670420440198664
}
}
```
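The paired `acc`/`acc_stderr` values in the results above are consistent with the sample standard error of a proportion with a Bessel-corrected (n − 1) denominator. A minimal sketch, assuming the `miscellaneous` split has 783 questions (so the reported 0.8250… accuracy corresponds to 646 correct answers) and that the harness uses the n − 1 form:

```python
import math

def binomial_stderr(acc: float, n: int) -> float:
    """Sample standard error of an accuracy computed over n questions,
    using the Bessel-corrected (n - 1) denominator."""
    return math.sqrt(acc * (1.0 - acc) / (n - 1))

# Assumed: miscellaneous has 783 questions, 646 answered correctly.
acc = 646 / 783
print(round(binomial_stderr(acc, 783), 6))  # ≈ 0.013587, matching acc_stderr above
```

Under these assumptions the formula reproduces the reported `acc_stderr` for `miscellaneous` to six decimal places; the question counts for other subjects would need to be checked against the benchmark itself.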
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-staging-eval-project-xsum-c7d88063-10885461 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-base-16384-book-summary
metrics: []
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-base-16384-book-summary
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
Multimodal-Fatima/OxfordPets_test_text_davinci_003_Visclues_ns_300 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: raw_prediction
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_12
num_bytes: 11719655.0
num_examples: 300
- name: fewshot_5
num_bytes: 10858951.0
num_examples: 300
download_size: 20270915
dataset_size: 22578606.0
---
# Dataset Card for "OxfordPets_test_text_davinci_003_Visclues_ns_300"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Ppoyaa__Lumina-2 | ---
pretty_name: Evaluation run of Ppoyaa/Lumina-2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Ppoyaa/Lumina-2](https://huggingface.co/Ppoyaa/Lumina-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Ppoyaa__Lumina-2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-10T12:15:24.619714](https://huggingface.co/datasets/open-llm-leaderboard/details_Ppoyaa__Lumina-2/blob/main/results_2024-04-10T12-15-24.619714.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6428610489851507,\n\
\ \"acc_stderr\": 0.03226063004209143,\n \"acc_norm\": 0.6445999566435736,\n\
\ \"acc_norm_stderr\": 0.032913910652058696,\n \"mc1\": 0.4394124847001224,\n\
\ \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.6024517231570853,\n\
\ \"mc2_stderr\": 0.015124147125707957\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6339590443686007,\n \"acc_stderr\": 0.01407722310847014,\n\
\ \"acc_norm\": 0.6621160409556314,\n \"acc_norm_stderr\": 0.01382204792228351\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.673272256522605,\n\
\ \"acc_stderr\": 0.004680582263524275,\n \"acc_norm\": 0.8602867954590719,\n\
\ \"acc_norm_stderr\": 0.0034598069913898376\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469546,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469546\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782658,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782658\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530336,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530336\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001503,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001503\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n\
\ \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n\
\ \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6470588235294118,\n \"acc_stderr\": 0.019333142020797164,\n \
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.019333142020797164\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.02892058322067561,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.02892058322067561\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169146,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4394124847001224,\n\
\ \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.6024517231570853,\n\
\ \"mc2_stderr\": 0.015124147125707957\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5837755875663382,\n \
\ \"acc_stderr\": 0.013577788334652662\n }\n}\n```"
repo_url: https://huggingface.co/Ppoyaa/Lumina-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|arc:challenge|25_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|gsm8k|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hellaswag|10_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T12-15-24.619714.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T12-15-24.619714.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- '**/details_harness|winogrande|5_2024-04-10T12-15-24.619714.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-10T12-15-24.619714.parquet'
- config_name: results
data_files:
- split: 2024_04_10T12_15_24.619714
path:
- results_2024-04-10T12-15-24.619714.parquet
- split: latest
path:
- results_2024-04-10T12-15-24.619714.parquet
---
# Dataset Card for Evaluation run of Ppoyaa/Lumina-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Ppoyaa/Lumina-2](https://huggingface.co/Ppoyaa/Lumina-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Ppoyaa__Lumina-2",
"harness_winogrande_5",
split="train")
```
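The split names shown in the configuration above encode the run timestamp, with `:` and `-` in the time portion replaced by `_`. As a small illustration, the following helper (hypothetical, not part of the `datasets` API) derives the split name used in this card from the ISO run timestamp:

```python
from datetime import datetime

def split_name(ts: str) -> str:
    """Convert an ISO run timestamp into the split-name form used in this card."""
    dt = datetime.fromisoformat(ts)
    # Date separators and time colons become underscores; microseconds are kept.
    return dt.strftime("%Y_%m_%dT%H_%M_%S.%f")

print(split_name("2024-04-10T12:15:24.619714"))
# 2024_04_10T12_15_24.619714
```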
## Latest results
These are the [latest results from run 2024-04-10T12:15:24.619714](https://huggingface.co/datasets/open-llm-leaderboard/details_Ppoyaa__Lumina-2/blob/main/results_2024-04-10T12-15-24.619714.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find the results for each eval in its "latest" split):
```python
{
"all": {
"acc": 0.6428610489851507,
"acc_stderr": 0.03226063004209143,
"acc_norm": 0.6445999566435736,
"acc_norm_stderr": 0.032913910652058696,
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.6024517231570853,
"mc2_stderr": 0.015124147125707957
},
"harness|arc:challenge|25": {
"acc": 0.6339590443686007,
"acc_stderr": 0.01407722310847014,
"acc_norm": 0.6621160409556314,
"acc_norm_stderr": 0.01382204792228351
},
"harness|hellaswag|10": {
"acc": 0.673272256522605,
"acc_stderr": 0.004680582263524275,
"acc_norm": 0.8602867954590719,
"acc_norm_stderr": 0.0034598069913898376
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469546,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469546
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782658,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782658
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530336,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530336
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001503,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001503
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869649,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869649
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.019333142020797164,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.019333142020797164
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.02892058322067561,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.02892058322067561
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169146,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.6024517231570853,
"mc2_stderr": 0.015124147125707957
},
"harness|winogrande|5": {
"acc": 0.8145224940805051,
"acc_stderr": 0.010923965303140505
},
"harness|gsm8k|5": {
"acc": 0.5837755875663382,
"acc_stderr": 0.013577788334652662
}
}
```
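As a sketch (not part of the official leaderboard tooling), the per-task metrics above can be aggregated directly from a `results_*.json` file; the two-task dict below is a minimal stand-in for the full file, with values copied from the run shown:

```python
import json  # in practice: results = json.load(open(path_to_results_file))

# Minimal stand-in for the contents of a results_*.json file; the real
# file holds one entry per harness task (values copied from the run above).
results = {
    "harness|winogrande|5": {"acc": 0.8145224940805051,
                             "acc_stderr": 0.010923965303140505},
    "harness|gsm8k|5": {"acc": 0.5837755875663382,
                        "acc_stderr": 0.013577788334652662},
}

def mean_acc(res):
    """Average the 'acc' metric over every task that reports one."""
    accs = [v["acc"] for v in res.values() if "acc" in v]
    return sum(accs) / len(accs)

print(round(mean_acc(results), 4))  # → 0.6991
```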
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jahb57/bert_embeddings_BATCH_7 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: last_hidden_state
sequence:
sequence: float32
- name: pooler_output
sequence: float32
splits:
- name: train
num_bytes: 19472263635
num_examples: 100000
download_size: 19593248304
dataset_size: 19472263635
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/tiamat_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tiamat (Granblue Fantasy)
This dataset covers tiamat (Granblue Fantasy) and contains 31 images together with their tags.
The core tags of this character are `long_hair, blue_hair, pointy_ears, breasts, very_long_hair, red_eyes, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 31 | 23.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamat_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 31 | 19.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamat_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 57 | 31.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamat_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 31 | 22.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamat_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 57 | 37.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamat_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tiamat_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
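Building on the loop above, tag-based filtering can be sketched with plain dicts standing in for waifuc items (the real items also carry the image object; `with_tag` is a hypothetical helper, not part of waifuc):

```python
# Hypothetical stand-ins for waifuc item metadata: each real item exposes
# item.meta['filename'] and item.meta['tags'] (plus the image itself).
items = [
    {"filename": "1.png", "tags": {"1girl": 1.0, "solo": 0.9, "navel": 0.7}},
    {"filename": "2.png", "tags": {"1girl": 1.0, "cleavage": 0.8}},
]

def with_tag(metas, tag):
    """Keep only the items whose tag dict contains the given tag."""
    return [m for m in metas if tag in m["tags"]]

print([m["filename"] for m in with_tag(items, "navel")])  # → ['1.png']
```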
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------|
| 0 | 31 |  |  |  |  |  | 1girl, solo, bare_shoulders, navel, cleavage, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | navel | cleavage | looking_at_viewer |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:--------|:-----------|:--------------------|
| 0 | 31 |  |  |  |  |  | X | X | X | X | X | X |
|
Juanid14317/EngSentimentAnalysis | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 2004838.9795756638
num_examples: 22208
- name: test
num_bytes: 501300.0204243363
num_examples: 5553
download_size: 1504880
dataset_size: 2506139.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
communityai/cognitivecomputations___samantha-data | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 16061925.0
num_examples: 5228
download_size: 8203537
dataset_size: 16061925.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_SanjiWatsuki__Silicon-Maid-7B | ---
pretty_name: Evaluation run of SanjiWatsuki/Silicon-Maid-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SanjiWatsuki/Silicon-Maid-7B](https://huggingface.co/SanjiWatsuki/Silicon-Maid-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SanjiWatsuki__Silicon-Maid-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-28T13:41:56.835099](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Silicon-Maid-7B/blob/main/results_2023-12-28T13-41-56.835099.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6484292617885924,\n\
\ \"acc_stderr\": 0.032101605659034985,\n \"acc_norm\": 0.6501618417828356,\n\
\ \"acc_norm_stderr\": 0.0327423043582351,\n \"mc1\": 0.44063647490820074,\n\
\ \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6163999701923091,\n\
\ \"mc2_stderr\": 0.015527755129556776\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6467576791808873,\n \"acc_stderr\": 0.013967822714840056,\n\
\ \"acc_norm\": 0.681740614334471,\n \"acc_norm_stderr\": 0.013611993916971453\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6853216490738897,\n\
\ \"acc_stderr\": 0.004634385694170046,\n \"acc_norm\": 0.865166301533559,\n\
\ \"acc_norm_stderr\": 0.0034084783337682664\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569526,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569526\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342856,\n\
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342856\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616325,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616325\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503224,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503224\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137276,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137276\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.01358661921990333,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.01358661921990333\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500097,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500097\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n\
\ \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n\
\ \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460842,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460842\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47979139504563234,\n\
\ \"acc_stderr\": 0.012759801427767564,\n \"acc_norm\": 0.47979139504563234,\n\
\ \"acc_norm_stderr\": 0.012759801427767564\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6470588235294118,\n \"acc_stderr\": 0.019333142020797164,\n \
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.019333142020797164\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44063647490820074,\n\
\ \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6163999701923091,\n\
\ \"mc2_stderr\": 0.015527755129556776\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6194086429112965,\n \
\ \"acc_stderr\": 0.01337397127772981\n }\n}\n```"
repo_url: https://huggingface.co/SanjiWatsuki/Silicon-Maid-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|arc:challenge|25_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|gsm8k|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hellaswag|10_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-28T13-41-56.835099.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-28T13-41-56.835099.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- '**/details_harness|winogrande|5_2023-12-28T13-41-56.835099.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-28T13-41-56.835099.parquet'
- config_name: results
data_files:
- split: 2023_12_28T13_41_56.835099
path:
- results_2023-12-28T13-41-56.835099.parquet
- split: latest
path:
- results_2023-12-28T13-41-56.835099.parquet
---
# Dataset Card for Evaluation run of SanjiWatsuki/Silicon-Maid-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SanjiWatsuki/Silicon-Maid-7B](https://huggingface.co/SanjiWatsuki/Silicon-Maid-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SanjiWatsuki__Silicon-Maid-7B",
	"harness_winogrande_5",
	split="latest")
```
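Once loaded, per-task metrics like the ones shown under "Latest results" below can be post-processed directly. A minimal sketch (the task names and accuracy values are copied from that JSON; the unweighted averaging here is illustrative only and is not the leaderboard's official aggregation):

```python
# Illustrative post-processing of per-task accuracies
# (values copied from the "Latest results" JSON in this card).
task_acc = {
    "harness|arc:challenge|25": 0.6467576791808873,
    "harness|hellaswag|10": 0.6853216490738897,
    "harness|hendrycksTest-abstract_algebra|5": 0.35,
}

# Simple unweighted mean over the selected tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(f"mean accuracy over {len(task_acc)} tasks: {mean_acc:.4f}")
```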
## Latest results
These are the [latest results from run 2023-12-28T13:41:56.835099](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Silicon-Maid-7B/blob/main/results_2023-12-28T13-41-56.835099.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6484292617885924,
"acc_stderr": 0.032101605659034985,
"acc_norm": 0.6501618417828356,
"acc_norm_stderr": 0.0327423043582351,
"mc1": 0.44063647490820074,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.6163999701923091,
"mc2_stderr": 0.015527755129556776
},
"harness|arc:challenge|25": {
"acc": 0.6467576791808873,
"acc_stderr": 0.013967822714840056,
"acc_norm": 0.681740614334471,
"acc_norm_stderr": 0.013611993916971453
},
"harness|hellaswag|10": {
"acc": 0.6853216490738897,
"acc_stderr": 0.004634385694170046,
"acc_norm": 0.865166301533559,
"acc_norm_stderr": 0.0034084783337682664
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569526,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569526
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342856,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342856
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616325,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616325
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503224,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654585,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137276,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137276
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098823,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098823
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.01358661921990333,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.01358661921990333
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500097,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500097
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47979139504563234,
"acc_stderr": 0.012759801427767564,
"acc_norm": 0.47979139504563234,
"acc_norm_stderr": 0.012759801427767564
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.019333142020797164,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.019333142020797164
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44063647490820074,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.6163999701923091,
"mc2_stderr": 0.015527755129556776
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.6194086429112965,
"acc_stderr": 0.01337397127772981
}
}
```
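For quick offline inspection, the per-task scores in the JSON block above can be aggregated with a few lines of standard-library Python. The dictionary below copies three illustrative entries from the results; this is a convenience sketch, not part of the evaluation harness, and the unweighted mean it prints is not the official leaderboard average.

```python
from statistics import mean

# Three illustrative task entries copied from the results JSON above;
# the full file contains one such entry per evaluated task.
results = {
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8421052631578947},
    "harness|winogrande|5": {"acc": 0.7900552486187845},
    "harness|gsm8k|5": {"acc": 0.6194086429112965},
}

# Unweighted mean accuracy across the selected tasks.
avg_acc = mean(entry["acc"] for entry in results.values())
print(f"mean acc over {len(results)} tasks: {avg_acc:.4f}")
```

The Open LLM Leaderboard computes its headline score from a fixed benchmark set with its own aggregation, so treat this as a sanity check only.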
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
suhas97/Llama-data-1 | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 1346303
num_examples: 1000
download_size: 789190
dataset_size: 1346303
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7bRP-v8 | ---
pretty_name: Evaluation run of jsfs11/MixtureofMerges-MoE-2x7bRP-v8
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jsfs11/MixtureofMerges-MoE-2x7bRP-v8](https://huggingface.co/jsfs11/MixtureofMerges-MoE-2x7bRP-v8)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7bRP-v8\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T15:51:07.328382](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7bRP-v8/blob/main/results_2024-04-02T15-51-07.328382.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6484935940951713,\n\
\ \"acc_stderr\": 0.03215161145324121,\n \"acc_norm\": 0.6485309992755834,\n\
\ \"acc_norm_stderr\": 0.03281401676717664,\n \"mc1\": 0.5299877600979193,\n\
\ \"mc1_stderr\": 0.01747199209169754,\n \"mc2\": 0.6869107298362201,\n\
\ \"mc2_stderr\": 0.015089506509549466\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.697098976109215,\n \"acc_stderr\": 0.013428241573185349,\n\
\ \"acc_norm\": 0.7133105802047781,\n \"acc_norm_stderr\": 0.013214986329274777\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7168890659231228,\n\
\ \"acc_stderr\": 0.004495891440519419,\n \"acc_norm\": 0.880601473809998,\n\
\ \"acc_norm_stderr\": 0.0032359418109431538\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.02354079935872329,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.02354079935872329\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\"\
: 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083004,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083004\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136094,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136094\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n\
\ \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.013547415658662257,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.013547415658662257\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468348,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468348\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4860335195530726,\n\
\ \"acc_stderr\": 0.01671597641074452,\n \"acc_norm\": 0.4860335195530726,\n\
\ \"acc_norm_stderr\": 0.01671597641074452\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n\
\ \"acc_stderr\": 0.012736153390214963,\n \"acc_norm\": 0.4634941329856584,\n\
\ \"acc_norm_stderr\": 0.012736153390214963\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6470588235294118,\n \"acc_stderr\": 0.019333142020797164,\n \
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.019333142020797164\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5299877600979193,\n\
\ \"mc1_stderr\": 0.01747199209169754,\n \"mc2\": 0.6869107298362201,\n\
\ \"mc2_stderr\": 0.015089506509549466\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825912\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.645185746777862,\n \
\ \"acc_stderr\": 0.01317908338797922\n }\n}\n```"
repo_url: https://huggingface.co/jsfs11/MixtureofMerges-MoE-2x7bRP-v8
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|arc:challenge|25_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|gsm8k|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hellaswag|10_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T15-51-07.328382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T15-51-07.328382.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- '**/details_harness|winogrande|5_2024-04-02T15-51-07.328382.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T15-51-07.328382.parquet'
- config_name: results
data_files:
- split: 2024_04_02T15_51_07.328382
path:
- results_2024-04-02T15-51-07.328382.parquet
- split: latest
path:
- results_2024-04-02T15-51-07.328382.parquet
---
# Dataset Card for Evaluation run of jsfs11/MixtureofMerges-MoE-2x7bRP-v8
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jsfs11/MixtureofMerges-MoE-2x7bRP-v8](https://huggingface.co/jsfs11/MixtureofMerges-MoE-2x7bRP-v8) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7bRP-v8",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-02T15:51:07.328382](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7bRP-v8/blob/main/results_2024-04-02T15-51-07.328382.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6484935940951713,
"acc_stderr": 0.03215161145324121,
"acc_norm": 0.6485309992755834,
"acc_norm_stderr": 0.03281401676717664,
"mc1": 0.5299877600979193,
"mc1_stderr": 0.01747199209169754,
"mc2": 0.6869107298362201,
"mc2_stderr": 0.015089506509549466
},
"harness|arc:challenge|25": {
"acc": 0.697098976109215,
"acc_stderr": 0.013428241573185349,
"acc_norm": 0.7133105802047781,
"acc_norm_stderr": 0.013214986329274777
},
"harness|hellaswag|10": {
"acc": 0.7168890659231228,
"acc_stderr": 0.004495891440519419,
"acc_norm": 0.880601473809998,
"acc_norm_stderr": 0.0032359418109431538
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695255,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695255
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.02354079935872329,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.02354079935872329
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083004,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083004
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136094,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136094
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662257,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662257
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468348,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4860335195530726,
"acc_stderr": 0.01671597641074452,
"acc_norm": 0.4860335195530726,
"acc_norm_stderr": 0.01671597641074452
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214963,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214963
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.019333142020797164,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.019333142020797164
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5299877600979193,
"mc1_stderr": 0.01747199209169754,
"mc2": 0.6869107298362201,
"mc2_stderr": 0.015089506509549466
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.010569021122825912
},
"harness|gsm8k|5": {
"acc": 0.645185746777862,
"acc_stderr": 0.01317908338797922
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Alexvval/alexvalval | ---
license: cc
---
|
Atipico1/NQ-colbert-10k-case-entity | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: masked_query
dtype: string
- name: original_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: unans_case
list:
- name: answer
dtype: string
- name: answers
sequence: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: conflict_case
list:
- name: answer
dtype: string
- name: conflict_context
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: context
dtype: string
- name: context_vague
dtype: string
- name: entity
dtype: string
- name: similar_entity
dtype: string
- name: similar_entity_score
dtype: float32
- name: random_entity
dtype: string
- name: random_entity_score
dtype: float64
splits:
- name: train
num_bytes: 105765666
num_examples: 6875
- name: test
num_bytes: 34382776
num_examples: 2230
download_size: 77391574
dataset_size: 140148442
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
bigscience-data/roots_ar_labr | ---
language: ar
license: gpl-2.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_ar_labr
# labr
- Dataset uid: `labr`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0076 % of total
- 0.0701 % of ar
### BigScience processing steps
#### Filters applied to: ar
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
|
nu-delta/oxford-pets | ---
dataset_info:
features:
- name: image
dtype: image
- name: image_name
dtype: string
- name: breed
dtype: string
- name: dog
dtype: bool
- name: pose
dtype: string
- name: bbox
sequence: int64
- name: seg_mask
dtype: image
splits:
- name: train
num_bytes: 386851121.73
num_examples: 3685
download_size: 385801564
dataset_size: 386851121.73
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tollefj/multi-nli-NOB | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation_matched
path: data/validation_matched-*
- split: validation_mismatched
path: data/validation_mismatched-*
dataset_info:
features:
- name: promptID
dtype: int32
- name: pairID
dtype: string
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: genre
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 81317900
num_examples: 392702
- name: validation_matched
num_bytes: 2010024
num_examples: 9815
- name: validation_mismatched
num_bytes: 2121266
num_examples: 9832
download_size: 56640779
dataset_size: 85449190
license: cc-by-4.0
---
# Dataset Card for "multi-nli-NOB"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aquamansam/Shrekfest | ---
license: cc-by-4.0
---
|
andrewrreed/agents-benchmark-eval-results | ---
dataset_info:
features:
- name: agent_name
dtype: string
- name: agent_model_id
dtype: string
- name: question
dtype: string
- name: gt_answer
dtype: string
- name: prediction
dtype: string
- name: intermediate_steps
dtype: string
- name: parsing_error
dtype: bool
- name: iteration_limit_exceeded
dtype: bool
- name: agent_error
dtype: string
- name: tools_used
sequence: string
- name: number_distinct_tools_used
dtype: float64
- name: number_of_steps
dtype: float64
- name: prometheus_evaluator_model_id
dtype: string
- name: eval_score_prometheus
dtype: int64
- name: eval_feedback_prometheus
dtype: string
- name: openai_evaluator_model_id
dtype: string
- name: eval_score_openai
dtype: int64
- name: eval_feedback_openai
dtype: string
- name: start_time
dtype: timestamp[ns]
- name: end_time
dtype: timestamp[ns]
- name: task
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 680642
num_examples: 245
download_size: 262768
dataset_size: 680642
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
knowledgator/sentence_rex | ---
license: apache-2.0
task_categories:
- text-classification
- text2text-generation
language:
- en
tags:
- text classification
- relation extraction
size_categories:
- 10K<n<100K
---
### SentenceRex
This is a dataset for training zero-shot and few-shot sentence-level relation extraction models.
The dataset was created from Wikipedia using a distant-supervision technique.
The labels were then manually checked for logical consistency with each sentence. Overall, the dataset covers **847** unique relations.
The entities in each relation are tagged as follows: <e1></e1> marks the source entity and <e2></e2> marks the target entity.
The `labels` column indicates the relation name.
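For illustration, the tagged entities can be recovered from a sample's text with a short regular-expression helper (the sentence below is a made-up example, not drawn from the dataset):

```python
import re

def extract_entities(text: str) -> dict:
    """Pull the <e1>...</e1> source and <e2>...</e2> target spans out of a tagged sentence."""
    source = re.search(r"<e1>(.*?)</e1>", text)
    target = re.search(r"<e2>(.*?)</e2>", text)
    return {
        "source": source.group(1) if source else None,
        "target": target.group(1) if target else None,
    }

sample = "<e1>Marie Curie</e1> was born in <e2>Warsaw</e2>."
print(extract_entities(sample))  # {'source': 'Marie Curie', 'target': 'Warsaw'}
```

The non-greedy `(.*?)` quantifiers keep each match inside its own tag pair, so the helper works even when both entities appear in the same sentence.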
### Feedback
We value your input! Share your feedback and suggestions to help us improve our models and datasets.
Fill out the feedback [form](https://forms.gle/5CPFFuLzNWznjcpL7)
### Join Our Discord
Connect with our community on Discord for news, support, and discussion about our models and datasets.
Join [Discord](https://discord.gg/mfZfwjpB) |
open-llm-leaderboard/details_luffycodes__nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple | ---
pretty_name: Evaluation run of luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple](https://huggingface.co/luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_luffycodes__nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T01:13:17.966803](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple/blob/main/results_2023-10-15T01-13-17.966803.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.18760486577181207,\n\
\ \"em_stderr\": 0.003998023634854269,\n \"f1\": 0.2689041526845642,\n\
\ \"f1_stderr\": 0.00405255679434132,\n \"acc\": 0.4260142426906131,\n\
\ \"acc_stderr\": 0.010340665159137691\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.18760486577181207,\n \"em_stderr\": 0.003998023634854269,\n\
\ \"f1\": 0.2689041526845642,\n \"f1_stderr\": 0.00405255679434132\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10538286580742987,\n \
\ \"acc_stderr\": 0.008457575884041755\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7466456195737964,\n \"acc_stderr\": 0.012223754434233626\n\
\ }\n}\n```"
repo_url: https://huggingface.co/luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|arc:challenge|25_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T01_13_17.966803
path:
- '**/details_harness|drop|3_2023-10-15T01-13-17.966803.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T01-13-17.966803.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T01_13_17.966803
path:
- '**/details_harness|gsm8k|5_2023-10-15T01-13-17.966803.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T01-13-17.966803.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hellaswag|10_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T01_13_17.966803
path:
- '**/details_harness|winogrande|5_2023-10-15T01-13-17.966803.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T01-13-17.966803.parquet'
- config_name: results
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- results_2023-09-01T18:20:29.445308.parquet
- split: 2023_10_15T01_13_17.966803
path:
- results_2023-10-15T01-13-17.966803.parquet
- split: latest
path:
- results_2023-10-15T01-13-17.966803.parquet
---
# Dataset Card for Evaluation run of luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple](https://huggingface.co/luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_luffycodes__nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-15T01:13:17.966803](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple/blob/main/results_2023-10-15T01-13-17.966803.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its timestamped split and in the "latest" split of the corresponding configuration):
```python
{
"all": {
"em": 0.18760486577181207,
"em_stderr": 0.003998023634854269,
"f1": 0.2689041526845642,
"f1_stderr": 0.00405255679434132,
"acc": 0.4260142426906131,
"acc_stderr": 0.010340665159137691
},
"harness|drop|3": {
"em": 0.18760486577181207,
"em_stderr": 0.003998023634854269,
"f1": 0.2689041526845642,
"f1_stderr": 0.00405255679434132
},
"harness|gsm8k|5": {
"acc": 0.10538286580742987,
"acc_stderr": 0.008457575884041755
},
"harness|winogrande|5": {
"acc": 0.7466456195737964,
"acc_stderr": 0.012223754434233626
}
}
```
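These aggregated metrics are plain JSON and can be inspected without any leaderboard tooling. As a minimal sketch (pure Python, over an abbreviated copy of the per-task metrics above), the accuracy-style scores can be collected like this:

```python
import json

# Abbreviated copy of the per-task metrics from the results JSON above.
results_json = """
{
  "harness|drop|3": {"em": 0.18760486577181207, "f1": 0.2689041526845642},
  "harness|gsm8k|5": {"acc": 0.10538286580742987},
  "harness|winogrande|5": {"acc": 0.7466456195737964}
}
"""

results = json.loads(results_json)

# Collect the accuracy metrics per task, rounded for display.
acc_by_task = {
    task: round(metrics["acc"], 4)
    for task, metrics in results.items()
    if "acc" in metrics
}
print(acc_by_task)  # {'harness|gsm8k|5': 0.1054, 'harness|winogrande|5': 0.7466}
```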
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
duramen/better | ---
license: afl-3.0
---
|
v-xchen-v/celebamask_hq | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 2915979365.0
num_examples: 30000
download_size: 2915048406
dataset_size: 2915979365.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-80000 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 5719992002
num_examples: 1000
download_size: 1157887504
dataset_size: 5719992002
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kqsong/InFoBench | ---
license: mit
language:
- en
pretty_name: InFoBench
size_categories:
- n<1K
---
# Dataset Card for InFoBench Dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Usage](#dataset-usage)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Repository:** [InFoBench Repository](https://github.com/qinyiwei/InfoBench)
- **Paper:** [InFoBench: Evaluating Instruction Following Ability in Large Language Models](https://arxiv.org/pdf/2401.03601.pdf)
The InFoBench Dataset is an evaluation benchmark containing 500 instructions and their corresponding 2,250 decomposed requirements.
## Dataset Usage
You can load it directly with the Hugging Face `datasets` library:
``` python
from datasets import load_dataset
dataset = load_dataset("kqsong/InFoBench")
```
## Dataset Structure
### Data Instances
For each instance, there is an instruction string, an input string (optional), a list of decomposed questions, and a list of the labels for each decomposed question.
```json
{
"id": "domain_oriented_task_215",
"input": "",
"category": "Business and Economics: Business Administration",
"instruction": "Generate a non-disclosure agreement of two pages (each page is limited to 250 words) for a software development project involving Party A and Party B. The confidentiality duration should be 5 years. \n\nThe first page should include definitions for key terms such as 'confidential information', 'disclosure', and 'recipient'. \n\nOn the second page, provide clauses detailing the protocol for the return or destruction of confidential information, exceptions to maintaining confidentiality, and the repercussions following a breach of the agreement. \n\nPlease indicate the separation between the first and second pages with a full line of dashed lines ('-----'). Also, make sure that each page is clearly labeled with its respective page number.",
"decomposed_questions": [
"Is the generated text a non-disclosure agreement?",
"Does the generated text consist of two pages?",
"Is each page of the generated text limited to 250 words?",
"Is the generated non-disclosure agreement for a software development project involving Party A and Party B?",
"Does the generated non-disclosure agreement specify a confidentiality duration of 5 years?",
"Does the first page of the generated non-disclosure agreement include definitions for key terms such as 'confidential information', 'disclosure', and 'recipient'?",
"Does the second page of the generated non-disclosure agreement provide clauses detailing the protocol for the return or destruction of confidential information?",
"Does the second page of the generated non-disclosure agreement provide exceptions to maintaining confidentiality?",
"Does the second page of the generated non-disclosure agreement provide the repercussions following a breach of the agreement?",
"Does the generated text indicate the separation between the first and second pages with a full line of dashed lines ('-----')?",
"Does the generated text ensure that each page is clearly labeled with its respective page number?"
],
"subset": "Hard_set",
"question_label": [
["Format"],
["Format", "Number"],
["Number"],
["Content"],
["Content"],
["Format", "Content"],
["Content"],
["Content"],
["Content"],
["Format"],
["Format"]
]
}
```
### Data Fields
- `id`: a string.
- `subset`: `Hard_Set` or `Easy_Set`.
- `category`: a string containing categorical information.
- `instruction`: a string containing instructions.
- `input`: a string containing the context information; may be an empty string.
- `decomposed_questions`: a list of strings, each corresponding to a decomposed requirement.
- `question_label`: a list of lists of strings, each inner list containing the labels for the corresponding decomposed question.
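The per-question structure supports requirement-level scoring: the paper's DRFR metric (Decomposed Requirements Following Ratio) is the fraction of decomposed questions judged as satisfied. A minimal sketch of that aggregation, with hypothetical yes/no judgments (the evaluator outputs themselves are not part of this dataset):

```python
def drfr(judgments):
    """Decomposed Requirements Following Ratio: the fraction of decomposed
    requirements judged satisfied (True), pooled over all instances."""
    flat = [j for instance in judgments for j in instance]
    return sum(flat) / len(flat)

# Hypothetical judgments for two instances with 3 and 2 decomposed questions.
scores = [[True, True, False], [True, False]]
print(drfr(scores))  # 3 of 5 requirements satisfied -> 0.6
```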
## Additional Information
### Licensing Information
The InFoBench Dataset version 1.0.0 is released under the [MIT License](https://github.com/qinyiwei/InfoBench/blob/main/LICENSE).
### Citation Information
```
@article{qin2024infobench,
title={InFoBench: Evaluating Instruction Following Ability in Large Language Models},
author={Yiwei Qin and Kaiqiang Song and Yebowen Hu and Wenlin Yao and Sangwoo Cho and Xiaoyang Wang and Xuansheng Wu and Fei Liu and Pengfei Liu and Dong Yu},
year={2024},
eprint={2401.03601},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
Rohan132/Deduplicated_Orca_dataset_ | ---
license: mit
---
|
HumanCompatibleAI/ppo-seals-Swimmer-v0 | ---
dataset_info:
features:
- name: obs
sequence:
sequence: float64
- name: acts
sequence:
sequence: float32
- name: infos
sequence: string
- name: terminal
dtype: bool
- name: rews
sequence: float64
splits:
- name: train
num_bytes: 128625365
num_examples: 104
download_size: 23073060
dataset_size: 128625365
---
# Dataset Card for "ppo-seals-Swimmer-v0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_200 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 17036514000.125
num_examples: 177375
download_size: 15231547113
dataset_size: 17036514000.125
---
# Dataset Card for "chunk_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Luxem/Plant-Disease-Classification | ---
license: bigscience-openrail-m
---
|
Aashi/Q_and_A_Google_devices | ---
license: mit
task_categories:
- text-classification
language:
- en
tags:
- code
- tech
pretty_name: A2GD (All About Google Devices)
size_categories:
- n<1K
--- |
Norarolalora/ainzedamanga | ---
license: openrail
---
|
Ekhlass/flutter_docs_1 | ---
license: apache-2.0
---
|
dmrau/cqadupstack-physics-qrels | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 50809
num_examples: 1933
download_size: 25022
dataset_size: 50809
---
# Dataset Card for "cqadupstack-physics-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
banking77 | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- intent-classification
- multi-class-classification
pretty_name: BANKING77
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': activate_my_card
'1': age_limit
'2': apple_pay_or_google_pay
'3': atm_support
'4': automatic_top_up
'5': balance_not_updated_after_bank_transfer
'6': balance_not_updated_after_cheque_or_cash_deposit
'7': beneficiary_not_allowed
'8': cancel_transfer
'9': card_about_to_expire
'10': card_acceptance
'11': card_arrival
'12': card_delivery_estimate
'13': card_linking
'14': card_not_working
'15': card_payment_fee_charged
'16': card_payment_not_recognised
'17': card_payment_wrong_exchange_rate
'18': card_swallowed
'19': cash_withdrawal_charge
'20': cash_withdrawal_not_recognised
'21': change_pin
'22': compromised_card
'23': contactless_not_working
'24': country_support
'25': declined_card_payment
'26': declined_cash_withdrawal
'27': declined_transfer
'28': direct_debit_payment_not_recognised
'29': disposable_card_limits
'30': edit_personal_details
'31': exchange_charge
'32': exchange_rate
'33': exchange_via_app
'34': extra_charge_on_statement
'35': failed_transfer
'36': fiat_currency_support
'37': get_disposable_virtual_card
'38': get_physical_card
'39': getting_spare_card
'40': getting_virtual_card
'41': lost_or_stolen_card
'42': lost_or_stolen_phone
'43': order_physical_card
'44': passcode_forgotten
'45': pending_card_payment
'46': pending_cash_withdrawal
'47': pending_top_up
'48': pending_transfer
'49': pin_blocked
'50': receiving_money
'51': Refund_not_showing_up
'52': request_refund
'53': reverted_card_payment?
'54': supported_cards_and_currencies
'55': terminate_account
'56': top_up_by_bank_transfer_charge
'57': top_up_by_card_charge
'58': top_up_by_cash_or_cheque
'59': top_up_failed
'60': top_up_limits
'61': top_up_reverted
'62': topping_up_by_card
'63': transaction_charged_twice
'64': transfer_fee_charged
'65': transfer_into_account
'66': transfer_not_received_by_recipient
'67': transfer_timing
'68': unable_to_verify_identity
'69': verify_my_identity
'70': verify_source_of_funds
'71': verify_top_up
'72': virtual_card_not_working
'73': visa_or_mastercard
'74': why_verify_identity
'75': wrong_amount_of_cash_received
'76': wrong_exchange_rate_for_cash_withdrawal
splits:
- name: train
num_bytes: 715028
num_examples: 10003
- name: test
num_bytes: 204010
num_examples: 3080
download_size: 392040
dataset_size: 919038
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
train-eval-index:
- config: default
task: text-classification
task_id: multi_class_classification
splits:
train_split: train
eval_split: test
col_mapping:
text: text
label: target
metrics:
- type: accuracy
name: Accuracy
- type: f1
name: F1 macro
args:
average: macro
- type: f1
name: F1 micro
args:
average: micro
- type: f1
name: F1 weighted
args:
average: weighted
- type: precision
name: Precision macro
args:
average: macro
- type: precision
name: Precision micro
args:
average: micro
- type: precision
name: Precision weighted
args:
average: weighted
- type: recall
name: Recall macro
args:
average: macro
- type: recall
name: Recall micro
args:
average: micro
- type: recall
name: Recall weighted
args:
average: weighted
---
# Dataset Card for BANKING77
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/PolyAI-LDN/task-specific-datasets)
- **Repository:** [Github](https://github.com/PolyAI-LDN/task-specific-datasets)
- **Paper:** [ArXiv](https://arxiv.org/abs/2003.04807)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
<div class="course-tip course-tip-orange bg-gradient-to-br dark:bg-gradient-to-r before:border-orange-500 dark:before:border-orange-800 from-orange-50 dark:from-gray-900 to-white dark:to-gray-950 border border-orange-50 text-orange-700 dark:text-gray-400">
<p><b>Deprecated:</b> Dataset "banking77" is deprecated and will be deleted. Use "<a href="https://huggingface.co/datasets/PolyAI/banking77">PolyAI/banking77</a>" instead.</p>
</div>
Dataset composed of online banking queries annotated with their corresponding intents.
The BANKING77 dataset provides a very fine-grained set of intents in the banking domain.
It comprises 13,083 customer service queries labeled with 77 intents.
It focuses on fine-grained single-domain intent detection.
### Supported Tasks and Leaderboards
Intent classification, intent detection
### Languages
English
## Dataset Structure
### Data Instances
An example of 'train' looks as follows:
```
{
'label': 11, # integer label corresponding to "card_arrival" intent
'text': 'I am still waiting on my card?'
}
```
### Data Fields
- `text`: a string feature.
- `label`: one of the classification labels (0–76), each corresponding to a unique intent.
Intent names are mapped to `label` values as follows:
| label | intent (category) |
|---:|:-------------------------------------------------|
| 0 | activate_my_card |
| 1 | age_limit |
| 2 | apple_pay_or_google_pay |
| 3 | atm_support |
| 4 | automatic_top_up |
| 5 | balance_not_updated_after_bank_transfer |
| 6 | balance_not_updated_after_cheque_or_cash_deposit |
| 7 | beneficiary_not_allowed |
| 8 | cancel_transfer |
| 9 | card_about_to_expire |
| 10 | card_acceptance |
| 11 | card_arrival |
| 12 | card_delivery_estimate |
| 13 | card_linking |
| 14 | card_not_working |
| 15 | card_payment_fee_charged |
| 16 | card_payment_not_recognised |
| 17 | card_payment_wrong_exchange_rate |
| 18 | card_swallowed |
| 19 | cash_withdrawal_charge |
| 20 | cash_withdrawal_not_recognised |
| 21 | change_pin |
| 22 | compromised_card |
| 23 | contactless_not_working |
| 24 | country_support |
| 25 | declined_card_payment |
| 26 | declined_cash_withdrawal |
| 27 | declined_transfer |
| 28 | direct_debit_payment_not_recognised |
| 29 | disposable_card_limits |
| 30 | edit_personal_details |
| 31 | exchange_charge |
| 32 | exchange_rate |
| 33 | exchange_via_app |
| 34 | extra_charge_on_statement |
| 35 | failed_transfer |
| 36 | fiat_currency_support |
| 37 | get_disposable_virtual_card |
| 38 | get_physical_card |
| 39 | getting_spare_card |
| 40 | getting_virtual_card |
| 41 | lost_or_stolen_card |
| 42 | lost_or_stolen_phone |
| 43 | order_physical_card |
| 44 | passcode_forgotten |
| 45 | pending_card_payment |
| 46 | pending_cash_withdrawal |
| 47 | pending_top_up |
| 48 | pending_transfer |
| 49 | pin_blocked |
| 50 | receiving_money |
| 51 | Refund_not_showing_up |
| 52 | request_refund |
| 53 | reverted_card_payment? |
| 54 | supported_cards_and_currencies |
| 55 | terminate_account |
| 56 | top_up_by_bank_transfer_charge |
| 57 | top_up_by_card_charge |
| 58 | top_up_by_cash_or_cheque |
| 59 | top_up_failed |
| 60 | top_up_limits |
| 61 | top_up_reverted |
| 62 | topping_up_by_card |
| 63 | transaction_charged_twice |
| 64 | transfer_fee_charged |
| 65 | transfer_into_account |
| 66 | transfer_not_received_by_recipient |
| 67 | transfer_timing |
| 68 | unable_to_verify_identity |
| 69 | verify_my_identity |
| 70 | verify_source_of_funds |
| 71 | verify_top_up |
| 72 | virtual_card_not_working |
| 73 | visa_or_mastercard |
| 74 | why_verify_identity |
| 75 | wrong_amount_of_cash_received |
| 76 | wrong_exchange_rate_for_cash_withdrawal |
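When working with the integer labels, it helps to keep an explicit id-to-name mapping (if you load the dataset with the `datasets` library, `dataset.features["label"].int2str` provides the same lookup). The dict below is a hand-built excerpt of the table above, for illustration only:

```python
# A few entries from the label-to-intent table above (the full mapping has 77).
id2label = {
    0: "activate_my_card",
    9: "card_about_to_expire",
    11: "card_arrival",
    76: "wrong_exchange_rate_for_cash_withdrawal",
}
label2id = {name: idx for idx, name in id2label.items()}

# Resolve the example instance shown earlier: label 11 -> "card_arrival".
print(id2label[11])              # card_arrival
print(label2id["card_arrival"])  # 11
```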
### Data Splits
| Dataset statistics | Train | Test |
| --- | --- | --- |
| Number of examples | 10 003 | 3 080 |
| Average character length | 59.5 | 54.2 |
| Number of intents | 77 | 77 |
| Number of domains | 1 | 1 |
## Dataset Creation
### Curation Rationale
Previous intent detection datasets such as Web Apps, Ask Ubuntu, the Chatbot Corpus or SNIPS are limited to a small number of classes (<10), which oversimplifies the intent detection task and does not emulate the true environment of commercial systems. Although there exist large-scale *multi-domain* datasets ([HWU64](https://github.com/xliuhw/NLU-Evaluation-Data) and [CLINC150](https://github.com/clinc/oos-eval)), the examples per domain may not sufficiently capture the full complexity of each domain as encountered "in the wild". This dataset tries to fill that gap by providing a very fine-grained set of intents in a *single domain*, i.e. **banking**. Its focus on fine-grained single-domain intent detection makes it complementary to the two multi-domain datasets.
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
The dataset does not contain any additional annotations.
#### Who are the annotators?
[N/A]
### Personal and Sensitive Information
[N/A]
## Considerations for Using the Data
### Social Impact of Dataset
The purpose of this dataset is to help develop better intent detection systems.
Any comprehensive intent detection evaluation should involve both coarser-grained multi-domain datasets and a fine-grained single-domain dataset such as BANKING77.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[PolyAI](https://github.com/PolyAI-LDN)
### Licensing Information
Creative Commons Attribution 4.0 International
### Citation Information
```
@inproceedings{Casanueva2020,
author = {I{\~{n}}igo Casanueva and Tadas Temcinas and Daniela Gerz and Matthew Henderson and Ivan Vulic},
title = {Efficient Intent Detection with Dual Sentence Encoders},
year = {2020},
month = {mar},
note = {Data available at https://github.com/PolyAI-LDN/task-specific-datasets},
url = {https://arxiv.org/abs/2003.04807},
booktitle = {Proceedings of the 2nd Workshop on NLP for ConvAI - ACL 2020}
}
```
### Contributions
Thanks to [@dkajtoch](https://github.com/dkajtoch) for adding this dataset. |
open-llm-leaderboard/details_luffycodes__llama-shishya-7b-ep3-v1 | ---
pretty_name: Evaluation run of luffycodes/llama-shishya-7b-ep3-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [luffycodes/llama-shishya-7b-ep3-v1](https://huggingface.co/luffycodes/llama-shishya-7b-ep3-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_luffycodes__llama-shishya-7b-ep3-v1_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-09T12:48:08.068028](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__llama-shishya-7b-ep3-v1_public/blob/main/results_2023-11-09T12-48-08.068028.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4594923428252717,\n\
\ \"acc_stderr\": 0.03404628674654547,\n \"acc_norm\": 0.46668909375227274,\n\
\ \"acc_norm_stderr\": 0.03497039082366745,\n \"mc1\": 0.204406364749082,\n\
\ \"mc1_stderr\": 0.014117174337432616,\n \"mc2\": 0.3089869590457097,\n\
\ \"mc2_stderr\": 0.013843169413571187,\n \"em\": 0.3115562080536913,\n\
\ \"em_stderr\": 0.004742879599828378,\n \"f1\": 0.3699653942953032,\n\
\ \"f1_stderr\": 0.004671420668393907\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.45307167235494883,\n \"acc_stderr\": 0.01454689205200563,\n\
\ \"acc_norm\": 0.4803754266211604,\n \"acc_norm_stderr\": 0.014600132075947092\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5934076877116112,\n\
\ \"acc_stderr\": 0.00490193651154613,\n \"acc_norm\": 0.7662816172077276,\n\
\ \"acc_norm_stderr\": 0.004223302177263009\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.49433962264150944,\n \"acc_stderr\": 0.030770900763851302,\n\
\ \"acc_norm\": 0.49433962264150944,\n \"acc_norm_stderr\": 0.030770900763851302\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.03733626655383509,\n \"acc_norm\": 0.3988439306358382,\n\
\ \"acc_norm_stderr\": 0.03733626655383509\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835362,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835362\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101806,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101806\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.535483870967742,\n\
\ \"acc_stderr\": 0.02837228779796293,\n \"acc_norm\": 0.535483870967742,\n\
\ \"acc_norm_stderr\": 0.02837228779796293\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35960591133004927,\n \"acc_stderr\": 0.03376458246509567,\n\
\ \"acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.03376458246509567\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.037425970438065864,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.037425970438065864\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5454545454545454,\n \"acc_stderr\": 0.03547601494006937,\n \"\
acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.03547601494006937\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6735751295336787,\n \"acc_stderr\": 0.033840286211432945,\n\
\ \"acc_norm\": 0.6735751295336787,\n \"acc_norm_stderr\": 0.033840286211432945\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.38974358974358975,\n \"acc_stderr\": 0.024726967886647078,\n\
\ \"acc_norm\": 0.38974358974358975,\n \"acc_norm_stderr\": 0.024726967886647078\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.03196876989195778,\n \
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.03196876989195778\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6440366972477064,\n \"acc_stderr\": 0.020528559278244214,\n \"\
acc_norm\": 0.6440366972477064,\n \"acc_norm_stderr\": 0.020528559278244214\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2916666666666667,\n \"acc_stderr\": 0.03099866630456053,\n \"\
acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.03099866630456053\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6029411764705882,\n \"acc_stderr\": 0.0343413116471913,\n \"acc_norm\"\
: 0.6029411764705882,\n \"acc_norm_stderr\": 0.0343413116471913\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.6455696202531646,\n \"acc_stderr\": 0.031137304297185815,\n \"\
acc_norm\": 0.6455696202531646,\n \"acc_norm_stderr\": 0.031137304297185815\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5426008968609866,\n\
\ \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.5426008968609866,\n\
\ \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755806,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755806\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7307692307692307,\n\
\ \"acc_stderr\": 0.029058588303748842,\n \"acc_norm\": 0.7307692307692307,\n\
\ \"acc_norm_stderr\": 0.029058588303748842\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6883780332056194,\n\
\ \"acc_stderr\": 0.016562433867284176,\n \"acc_norm\": 0.6883780332056194,\n\
\ \"acc_norm_stderr\": 0.016562433867284176\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49421965317919075,\n \"acc_stderr\": 0.02691729617914911,\n\
\ \"acc_norm\": 0.49421965317919075,\n \"acc_norm_stderr\": 0.02691729617914911\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.014551553659369922,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.014551553659369922\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5130718954248366,\n \"acc_stderr\": 0.028620130800700246,\n\
\ \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.028620130800700246\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5594855305466238,\n\
\ \"acc_stderr\": 0.02819640057419743,\n \"acc_norm\": 0.5594855305466238,\n\
\ \"acc_norm_stderr\": 0.02819640057419743\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5308641975308642,\n \"acc_stderr\": 0.027767689606833932,\n\
\ \"acc_norm\": 0.5308641975308642,\n \"acc_norm_stderr\": 0.027767689606833932\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611327,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611327\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31877444589308995,\n\
\ \"acc_stderr\": 0.011901895635786097,\n \"acc_norm\": 0.31877444589308995,\n\
\ \"acc_norm_stderr\": 0.011901895635786097\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714878,\n\
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714878\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4493464052287582,\n \"acc_stderr\": 0.020123766528027266,\n \
\ \"acc_norm\": 0.4493464052287582,\n \"acc_norm_stderr\": 0.020123766528027266\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n\
\ \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n\
\ \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5265306122448979,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.5265306122448979,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n\
\ \"acc_stderr\": 0.034457899643627506,\n \"acc_norm\": 0.6119402985074627,\n\
\ \"acc_norm_stderr\": 0.034457899643627506\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.672514619883041,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.672514619883041,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.204406364749082,\n\
\ \"mc1_stderr\": 0.014117174337432616,\n \"mc2\": 0.3089869590457097,\n\
\ \"mc2_stderr\": 0.013843169413571187\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6945540647198106,\n \"acc_stderr\": 0.012945038632552022\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.3115562080536913,\n \
\ \"em_stderr\": 0.004742879599828378,\n \"f1\": 0.3699653942953032,\n \
\ \"f1_stderr\": 0.004671420668393907\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/luffycodes/llama-shishya-7b-ep3-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|arc:challenge|25_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|drop|3_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|gsm8k|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hellaswag|10_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T12-48-08.068028.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-09T12-48-08.068028.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- '**/details_harness|winogrande|5_2023-11-09T12-48-08.068028.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-09T12-48-08.068028.parquet'
- config_name: results
data_files:
- split: 2023_11_09T12_48_08.068028
path:
- results_2023-11-09T12-48-08.068028.parquet
- split: latest
path:
- results_2023-11-09T12-48-08.068028.parquet
---
# Dataset Card for Evaluation run of luffycodes/llama-shishya-7b-ep3-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/luffycodes/llama-shishya-7b-ep3-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [luffycodes/llama-shishya-7b-ep3-v1](https://huggingface.co/luffycodes/llama-shishya-7b-ep3-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_luffycodes__llama-shishya-7b-ep3-v1_public",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-11-09T12:48:08.068028](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__llama-shishya-7b-ep3-v1_public/blob/main/results_2023-11-09T12-48-08.068028.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4594923428252717,
"acc_stderr": 0.03404628674654547,
"acc_norm": 0.46668909375227274,
"acc_norm_stderr": 0.03497039082366745,
"mc1": 0.204406364749082,
"mc1_stderr": 0.014117174337432616,
"mc2": 0.3089869590457097,
"mc2_stderr": 0.013843169413571187,
"em": 0.3115562080536913,
"em_stderr": 0.004742879599828378,
"f1": 0.3699653942953032,
"f1_stderr": 0.004671420668393907
},
"harness|arc:challenge|25": {
"acc": 0.45307167235494883,
"acc_stderr": 0.01454689205200563,
"acc_norm": 0.4803754266211604,
"acc_norm_stderr": 0.014600132075947092
},
"harness|hellaswag|10": {
"acc": 0.5934076877116112,
"acc_stderr": 0.00490193651154613,
"acc_norm": 0.7662816172077276,
"acc_norm_stderr": 0.004223302177263009
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49433962264150944,
"acc_stderr": 0.030770900763851302,
"acc_norm": 0.49433962264150944,
"acc_norm_stderr": 0.030770900763851302
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.03733626655383509,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.03733626655383509
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835362,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835362
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101806,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101806
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276864,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276864
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.535483870967742,
"acc_stderr": 0.02837228779796293,
"acc_norm": 0.535483870967742,
"acc_norm_stderr": 0.02837228779796293
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.03376458246509567,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.03376458246509567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.037425970438065864,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.037425970438065864
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.03547601494006937,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.03547601494006937
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6735751295336787,
"acc_stderr": 0.033840286211432945,
"acc_norm": 0.6735751295336787,
"acc_norm_stderr": 0.033840286211432945
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.38974358974358975,
"acc_stderr": 0.024726967886647078,
"acc_norm": 0.38974358974358975,
"acc_norm_stderr": 0.024726967886647078
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6440366972477064,
"acc_stderr": 0.020528559278244214,
"acc_norm": 0.6440366972477064,
"acc_norm_stderr": 0.020528559278244214
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03099866630456053,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03099866630456053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.0343413116471913,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.0343413116471913
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6455696202531646,
"acc_stderr": 0.031137304297185815,
"acc_norm": 0.6455696202531646,
"acc_norm_stderr": 0.031137304297185815
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5426008968609866,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.5426008968609866,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49693251533742333,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.49693251533742333,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755806,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755806
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7307692307692307,
"acc_stderr": 0.029058588303748842,
"acc_norm": 0.7307692307692307,
"acc_norm_stderr": 0.029058588303748842
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6883780332056194,
"acc_stderr": 0.016562433867284176,
"acc_norm": 0.6883780332056194,
"acc_norm_stderr": 0.016562433867284176
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49421965317919075,
"acc_stderr": 0.02691729617914911,
"acc_norm": 0.49421965317919075,
"acc_norm_stderr": 0.02691729617914911
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369922,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369922
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5130718954248366,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.5130718954248366,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5594855305466238,
"acc_stderr": 0.02819640057419743,
"acc_norm": 0.5594855305466238,
"acc_norm_stderr": 0.02819640057419743
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5308641975308642,
"acc_stderr": 0.027767689606833932,
"acc_norm": 0.5308641975308642,
"acc_norm_stderr": 0.027767689606833932
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611327,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611327
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.31877444589308995,
"acc_stderr": 0.011901895635786097,
"acc_norm": 0.31877444589308995,
"acc_norm_stderr": 0.011901895635786097
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.030042615832714878,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.030042615832714878
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4493464052287582,
"acc_stderr": 0.020123766528027266,
"acc_norm": 0.4493464052287582,
"acc_norm_stderr": 0.020123766528027266
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5265306122448979,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.5265306122448979,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6119402985074627,
"acc_stderr": 0.034457899643627506,
"acc_norm": 0.6119402985074627,
"acc_norm_stderr": 0.034457899643627506
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.672514619883041,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.672514619883041,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.204406364749082,
"mc1_stderr": 0.014117174337432616,
"mc2": 0.3089869590457097,
"mc2_stderr": 0.013843169413571187
},
"harness|winogrande|5": {
"acc": 0.6945540647198106,
"acc_stderr": 0.012945038632552022
},
"harness|drop|3": {
"em": 0.3115562080536913,
"em_stderr": 0.004742879599828378,
"f1": 0.3699653942953032,
"f1_stderr": 0.004671420668393907
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
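Once loaded, the `results` config exposes these aggregates in the same shape as the JSON above. As a minimal offline sketch (the dict below is copied from the `"all"` block above, and `headline_metrics` is a hypothetical helper, not part of the evaluation harness), the headline metrics can be read out like this:

```python
# Hypothetical helper: extract the aggregated headline metrics from a
# results dict shaped like the "all" block of the JSON shown above.
results = {
    "all": {
        "acc": 0.4594923428252717,
        "acc_norm": 0.46668909375227274,
        "mc1": 0.204406364749082,
        "mc2": 0.3089869590457097,
    }
}

def headline_metrics(res):
    """Return the aggregated (acc, acc_norm) pair from the 'all' block."""
    block = res["all"]
    return block["acc"], block["acc_norm"]

acc, acc_norm = headline_metrics(results)
print(f"acc={acc:.4f}, acc_norm={acc_norm:.4f}")  # acc=0.4595, acc_norm=0.4667
```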
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |