| datasetId | card |
|---|---|
malucoelhaofc/GeraldBroflovskV2 | ---
license: openrail
---
|
CyberHarem/ifrit_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ifrit/イフリータ/伊芙利特 (Arknights)
This is the dataset of ifrit/イフリータ/伊芙利特 (Arknights), containing 500 images and their tags.
The core tags of this character are `horns, blonde_hair, orange_eyes, twintails, demon_horns, short_hair, short_twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 797.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ifrit_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 390.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ifrit_arknights/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1196 | 832.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ifrit_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 673.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ifrit_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1196 | 1.25 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ifrit_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. To load it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/ifrit_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
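Once loaded, the per-image tag data can be summarized directly. Below is a minimal sketch (the helper name `tag_frequencies` is hypothetical, and it assumes each `item.meta['tags']` iterates over tag strings as printed above); a small literal sample stands in for the real iteration to keep it self-contained:

```python
from collections import Counter

def tag_frequencies(tag_lists):
    """Count how often each tag appears across per-image tag collections."""
    counter = Counter()
    for tags in tag_lists:
        counter.update(tags)
    return counter

# In practice the tag lists would come from the LocalSource above, e.g.
#   tag_frequencies(item.meta['tags'] for item in source)
# Here, a small literal sample keeps the sketch self-contained:
sample = [
    ['1girl', 'solo', 'fire'],
    ['1girl', 'smile'],
]
freq = tag_frequencies(sample)
print(freq['1girl'])  # appears in both sample images, so counts 2
```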
## List of Clusters
The tag clustering results are listed below; distinct outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, black_footwear, solo, full_body, toeless_footwear, thigh_strap, looking_at_viewer, toenail_polish, black_choker, smile, holding_weapon, standing, short_dress, black_dress, fire, orange_nails, sandals, oripathy_lesion_(arknights), striped_dress, open_mouth, simple_background, white_background, low_twintails, originium_arts_(arknights), rhine_lab_logo |
| 1 | 9 |  |  |  |  |  | 1girl, black_choker, black_dress, cowboy_shot, fire, looking_at_viewer, solo, short_dress, vertical-striped_dress, holding_weapon, nail_polish, thigh_strap, breasts, simple_background, smile, vertical-striped_clothes, originium_arts_(arknights), tail, white_background |
| 2 | 6 |  |  |  |  |  | 1girl, black_choker, black_dress, solo, vertical-striped_dress, looking_at_viewer, upper_body, vertical-striped_clothes, fire, open_mouth, simple_background, smile, white_background, white_jacket, cowboy_shot, originium_arts_(arknights) |
| 3 | 23 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_choker, upper_body, simple_background, dress, white_background |
| 4 | 5 |  |  |  |  |  | 1girl, black_choker, simple_background, solo, collarbone, looking_at_viewer, shirt, smile, upper_body, white_background, low_twintails, open_mouth, blush, fire |
| 5 | 13 |  |  |  |  |  | 1girl, black_choker, long_sleeves, open_jacket, solo, long_hair, black_jacket, black_shorts, simple_background, alternate_costume, cowboy_shot, looking_at_viewer, tail, white_background, thigh_strap, shirt, smile, sports_bra, brown_jacket, crop_top, open_mouth, short_shorts |
| 6 | 29 |  |  |  |  |  | bare_shoulders, official_alternate_costume, white_bikini, 1girl, solo, looking_at_viewer, navel, stomach, oripathy_lesion_(arknights), smile, bare_arms, small_breasts, nail_polish, holding, orange_choker, collarbone, single_hair_bun, sarashi, standing, arm_strap, orange_nails, tail, upper_body |
| 7 | 5 |  |  |  |  |  | 1girl, alternate_costume, kneehighs, pleated_skirt, solo, black_skirt, looking_at_viewer, simple_background, white_shirt, white_socks, black_footwear, black_sailor_collar, full_body, loafers, serafuku, short_sleeves, white_background, long_sleeves, low_twintails, sitting, standing |
| 8 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, solo_focus, bar_censor, penis, pussy, bottomless, navel, nipples, sex, small_breasts, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_footwear | solo | full_body | toeless_footwear | thigh_strap | looking_at_viewer | toenail_polish | black_choker | smile | holding_weapon | standing | short_dress | black_dress | fire | orange_nails | sandals | oripathy_lesion_(arknights) | striped_dress | open_mouth | simple_background | white_background | low_twintails | originium_arts_(arknights) | rhine_lab_logo | cowboy_shot | vertical-striped_dress | nail_polish | breasts | vertical-striped_clothes | tail | upper_body | white_jacket | dress | collarbone | shirt | blush | long_sleeves | open_jacket | long_hair | black_jacket | black_shorts | alternate_costume | sports_bra | brown_jacket | crop_top | short_shorts | bare_shoulders | official_alternate_costume | white_bikini | navel | stomach | bare_arms | small_breasts | holding | orange_choker | single_hair_bun | sarashi | arm_strap | kneehighs | pleated_skirt | black_skirt | white_shirt | white_socks | black_sailor_collar | loafers | serafuku | short_sleeves | sitting | 1boy | hetero | solo_focus | bar_censor | penis | pussy | bottomless | nipples | sex | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:------------|:-------------------|:--------------|:--------------------|:-----------------|:---------------|:--------|:-----------------|:-----------|:--------------|:--------------|:-------|:---------------|:----------|:------------------------------|:----------------|:-------------|:--------------------|:-------------------|:----------------|:-----------------------------|:-----------------|:--------------|:-------------------------|:--------------|:----------|:---------------------------|:-------|:-------------|:---------------|:--------|:-------------|:--------|:--------|:---------------|:--------------|:------------|:---------------|:---------------|:--------------------|:-------------|:---------------|:-----------|:---------------|:-----------------|:-----------------------------|:---------------|:--------|:----------|:------------|:----------------|:----------|:----------------|:------------------|:----------|:------------|:------------|:----------------|:--------------|:--------------|:--------------|:----------------------|:----------|:-----------|:----------------|:----------|:-------|:---------|:-------------|:-------------|:--------|:--------|:-------------|:----------|:------|:----------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | | X | | | X | X | | X | X | X | | X | X | X | | | | | | X | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | | | | X | | X | X | | | | X | X | | | | | X | X | X | | X | | X | X | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 23 |  |  |  |  |  | X | | X | | | | X | | X | | | | | | | | | | | | X | X | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | | | | X | | X | X | | | | | X | | | | | X | X | X | X | | | | | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 13 |  |  |  |  |  | X | | X | | | X | X | | X | X | | | | | | | | | | X | X | X | | | | X | | | | | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 29 |  |  |  |  |  | X | | X | | | | X | | | X | | X | | | | X | | X | | | | | | | | | | X | | | X | X | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | X | X | | | X | | | | | X | | | | | | | | | X | X | X | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
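One simple way to mine outfits from the clusters above: tags shared by every cluster are likely core character traits, while the remainder of each cluster's tag set hints at an outfit. A hypothetical helper sketching that split (the tag lists here are abbreviated from clusters 0 and 6 above):

```python
def split_core_and_outfit(clusters):
    """Split cluster tag lists into (shared core tags, per-cluster outfit tags)."""
    core = set(clusters[0])
    for tags in clusters[1:]:
        core &= set(tags)  # keep only tags present in every cluster
    outfits = [sorted(set(tags) - core) for tags in clusters]
    return core, outfits

clusters = [
    ['1girl', 'solo', 'looking_at_viewer', 'black_dress', 'fire'],
    ['1girl', 'solo', 'looking_at_viewer', 'white_bikini'],
]
core, outfits = split_core_and_outfit(clusters)
# core holds the character-level tags; outfits[i] holds cluster i's leftovers
```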
|
basil2kk4/oke | ---
license: apache-2.0
---
|
damerajee/eval_indic-qa | ---
dataset_info:
features:
- name: Input Text
dtype: string
- name: Actual answer
dtype: string
- name: Model answer
dtype: string
- name: Recall
dtype: float64
- name: Precision
dtype: float64
- name: F1 Score
dtype: float64
splits:
- name: train
num_bytes: 338225
num_examples: 50
download_size: 147271
dataset_size: 338225
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
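The features above include per-example `Precision`, `Recall`, and `F1 Score` columns. Assuming the usual harmonic-mean definition relates them (an assumption; the card does not state the formula), a small sanity-check helper:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall; defined as 0.0 when both are zero."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# To check rows of the actual dataset, one could load it with
#   from datasets import load_dataset
#   ds = load_dataset('damerajee/eval_indic-qa', split='train')
# and compare each row's 'F1 Score' against f1_score(row['Precision'], row['Recall']).
print(f1_score(0.8, 0.5))  # harmonic mean of 0.8 and 0.5
```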
|
open-llm-leaderboard/details_jan-hq__stealth-rag-v1.1 | ---
pretty_name: Evaluation run of jan-hq/stealth-rag-v1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jan-hq/stealth-rag-v1.1](https://huggingface.co/jan-hq/stealth-rag-v1.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jan-hq__stealth-rag-v1.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T21:37:07.649843](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__stealth-rag-v1.1/blob/main/results_2024-02-09T21-37-07.649843.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.642701044613855,\n\
\ \"acc_stderr\": 0.032067149680735214,\n \"acc_norm\": 0.6436584541939985,\n\
\ \"acc_norm_stderr\": 0.03271996389337109,\n \"mc1\": 0.34149326805385555,\n\
\ \"mc1_stderr\": 0.01660068861995083,\n \"mc2\": 0.49642217442112185,\n\
\ \"mc2_stderr\": 0.015181105379233154\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5836177474402731,\n \"acc_stderr\": 0.014405618279436174,\n\
\ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000328\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6337382991435969,\n\
\ \"acc_stderr\": 0.004807975515446489,\n \"acc_norm\": 0.8382792272455686,\n\
\ \"acc_norm_stderr\": 0.0036744197993536687\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.037150621549989056,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.037150621549989056\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.0372424959581773,\n\
\ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.0372424959581773\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n\
\ \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n\
\ \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n\
\ \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n\
\ \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n\
\ \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5517241379310345,\n \"acc_stderr\": 0.041443118108781526,\n \"\
acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.041443118108781526\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"\
acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431378,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431378\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.030360379710291957,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.030360379710291957\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3396648044692737,\n\
\ \"acc_stderr\": 0.01583940040621249,\n \"acc_norm\": 0.3396648044692737,\n\
\ \"acc_norm_stderr\": 0.01583940040621249\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n\
\ \"acc_stderr\": 0.012702317490559802,\n \"acc_norm\": 0.4485006518904824,\n\
\ \"acc_norm_stderr\": 0.012702317490559802\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.01911721391149515,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.01911721391149515\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061452,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061452\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34149326805385555,\n\
\ \"mc1_stderr\": 0.01660068861995083,\n \"mc2\": 0.49642217442112185,\n\
\ \"mc2_stderr\": 0.015181105379233154\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235803\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6777862016679302,\n \
\ \"acc_stderr\": 0.012872435481188776\n }\n}\n```"
repo_url: https://huggingface.co/jan-hq/stealth-rag-v1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|arc:challenge|25_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|gsm8k|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hellaswag|10_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-37-07.649843.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T21-37-07.649843.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- '**/details_harness|winogrande|5_2024-02-09T21-37-07.649843.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T21-37-07.649843.parquet'
- config_name: results
data_files:
- split: 2024_02_09T21_37_07.649843
path:
- results_2024-02-09T21-37-07.649843.parquet
- split: latest
path:
- results_2024-02-09T21-37-07.649843.parquet
---
# Dataset Card for Evaluation run of jan-hq/stealth-rag-v1.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jan-hq/stealth-rag-v1.1](https://huggingface.co/jan-hq/stealth-rag-v1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jan-hq__stealth-rag-v1.1",
	"harness_winogrande_5",
	split="latest")
```
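Each MMLU sub-task listed in the configurations above follows a fixed naming pattern (`harness_hendrycksTest_<task>_<num_fewshot>`). As a minimal illustration, a small helper (hypothetical, not part of the `datasets` library) can build the config name to pass to `load_dataset`:

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Build the config name used by this dataset for a given
    hendrycksTest sub-task and few-shot setting,
    e.g. 'harness_hendrycksTest_anatomy_5'."""
    return f"harness_hendrycksTest_{task}_{num_fewshot}"

# The resulting name is the second argument to datasets.load_dataset,
# typically combined with split="latest".
print(harness_config_name("anatomy", 5))
```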
## Latest results
These are the [latest results from run 2024-02-09T21:37:07.649843](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__stealth-rag-v1.1/blob/main/results_2024-02-09T21-37-07.649843.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split of its own configuration):
```python
{
"all": {
"acc": 0.642701044613855,
"acc_stderr": 0.032067149680735214,
"acc_norm": 0.6436584541939985,
"acc_norm_stderr": 0.03271996389337109,
"mc1": 0.34149326805385555,
"mc1_stderr": 0.01660068861995083,
"mc2": 0.49642217442112185,
"mc2_stderr": 0.015181105379233154
},
"harness|arc:challenge|25": {
"acc": 0.5836177474402731,
"acc_stderr": 0.014405618279436174,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000328
},
"harness|hellaswag|10": {
"acc": 0.6337382991435969,
"acc_stderr": 0.004807975515446489,
"acc_norm": 0.8382792272455686,
"acc_norm_stderr": 0.0036744197993536687
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.037150621549989056,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.037150621549989056
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.041443118108781526,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.041443118108781526
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431378,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431378
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291957,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291957
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097652,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097652
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3396648044692737,
"acc_stderr": 0.01583940040621249,
"acc_norm": 0.3396648044692737,
"acc_norm_stderr": 0.01583940040621249
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4485006518904824,
"acc_stderr": 0.012702317490559802,
"acc_norm": 0.4485006518904824,
"acc_norm_stderr": 0.012702317490559802
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.01911721391149515,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.01911721391149515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061452,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061452
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34149326805385555,
"mc1_stderr": 0.01660068861995083,
"mc2": 0.49642217442112185,
"mc2_stderr": 0.015181105379233154
},
"harness|winogrande|5": {
"acc": 0.7932123125493291,
"acc_stderr": 0.011382566829235803
},
"harness|gsm8k|5": {
"acc": 0.6777862016679302,
"acc_stderr": 0.012872435481188776
}
}
```
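For illustration, per-task scores like those in the JSON above can be averaged into a single accuracy figure. The sketch below hard-codes a three-task excerpt rather than parsing the full results file, so the dictionary literal is an assumption-free copy of three entries only:

```python
# Sketch: average the "acc" field over a small excerpt of the
# per-task results shown above (three tasks only, for illustration).
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5301204819277109},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.847953216374269},
    "harness|winogrande|5": {"acc": 0.7932123125493291},
}

# Unweighted mean over the selected tasks.
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))  # 0.7238
```

Note this is an unweighted mean over the chosen tasks; the leaderboard's own aggregation covers all tasks.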
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Kentaline/hf-dataset-study | ---
annotations_creators:
- crowdsourced
language:
- ja
language_creators:
- crowdsourced
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: squad
pretty_name: squad-ja
size_categories:
- 100K<n<1M
source_datasets:
- original
tags: []
task_categories:
- question-answering
task_ids:
- open-domain-qa
- extractive-qa
train-eval-index:
- col_mapping:
answers:
answer_start: answer_start
text: text
context: context
question: question
config: squad_v2
metrics:
- name: SQuAD v2
type: squad_v2
splits:
eval_split: validation
train_split: train
task: question-answering
task_id: extractive_question_answering
---
# Dataset Card for squad-ja
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
A Japanese version of SQuAD 2.0, machine-translated with the Google Translate API.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Japanese
## Dataset Structure
### Data Instances
```
{
"start": 43,
"end": 88,
"question": "ビヨンセ は いつ から 人気 を 博し 始め ました か ?",
"context": "BeyoncéGiselleKnowles - Carter ( /b i ː ˈ j ɒ nse ɪ / bee - YON - say ) ( 1981 年 9 月 4 日 生まれ ) は 、 アメリカ の シンガー 、 ソング ライター 、 レコード プロデューサー 、 女優 です 。 テキサス 州 ヒューストン で 生まれ育った 彼女 は 、 子供 の 頃 に さまざまな 歌 と 踊り の コンテスト に 出演 し 、 1990 年 代 後半 に R & B ガールグループ Destiny & 39 ; sChild の リード シンガー と して 名声 を 博し ました 。 父親 の マシューノウルズ が 管理 する この グループ は 、 世界 で 最も 売れて いる 少女 グループ の 1 つ に なり ました 。 彼 ら の 休み は ビヨンセ の デビュー アルバム 、 DangerouslyinLove ( 2003 ) の リリース を 見 ました 。 彼女 は 世界 中 で ソロ アーティスト と して 確立 し 、 5 つ の グラミー 賞 を 獲得 し 、 ビル ボード ホット 100 ナンバーワン シングル 「 CrazyinLove 」 と 「 BabyBoy 」 を フィーチャー し ました 。",
"id": "56be85543aeaaa14008c9063"
}
```
### Data Fields
- start
- end
- question
- context
- id
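The `start` and `end` fields appear to be character offsets into `context` (an assumption inferred from the data instance above, not documented by the authors); if so, the answer span can be recovered by slicing:

```python
# Hedged sketch: recover the answer text from a record, assuming
# `start`/`end` are character offsets into `context` (toy data below,
# not an actual record from this dataset).
record = {
    "start": 6,
    "end": 11,
    "question": "What comes after the greeting?",
    "context": "Hello world!",
    "id": "toy-0001",
}

answer = record["context"][record["start"]:record["end"]]
print(answer)  # world
```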
### Data Splits
- train: 86,820 examples
- valid: 5,927 examples
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
MLINSEA/Moroccan_ads | ---
dataset_info:
features:
- name: ad
dtype: string
- name: title
dtype: string
- name: link
dtype: string
- name: channel
dtype: string
splits:
- name: train
num_bytes: 1115354
num_examples: 3992
download_size: 366806
dataset_size: 1115354
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Moroccan_ads"
# YouTube Ads Dataset from Moroccan Channels
## Description
This dataset contains advertisements and related information from Moroccan YouTube channels. It's designed to facilitate research in digital marketing, content analysis, and linguistic studies focused on Moroccan Arabic and French.
## Dataset Structure
The dataset consists of 3,992 records, each representing an advertisement from YouTube. The data is organized into four columns:
- `ad`: The text of the advertisement.
- `title`: The title of the YouTube video from which the ad was extracted.
- `link`: The URL to the YouTube video.
- `channel`: The identifier of the YouTube channel (e.g., `@orangemaroc`).
## Data Cleaning
Users should be aware that the dataset contains raw data that may need to be cleaned and preprocessed for analysis. This can include removing special characters, correcting typos, or standardizing text format.
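As a minimal sketch of that kind of preprocessing (the regex patterns below are illustrative assumptions, not part of the dataset or any official tooling):

```python
import re

def clean_ad_text(text: str) -> str:
    """Illustrative cleanup: drop URLs, strip stray symbols, collapse whitespace."""
    text = re.sub(r"https?://\S+", " ", text)                      # remove links
    text = re.sub(r"[^\w\s.,!?'-]", " ", text, flags=re.UNICODE)   # strip special characters
    text = re.sub(r"\s+", " ", text).strip()                       # normalize whitespace
    return text

cleaned = clean_ad_text("Promo  #2024 → visitez https://example.com !!")
```

Depending on the analysis, further steps such as Arabic/French script normalization may also be needed.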
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/tldr_news_3k | ---
dataset_info:
features:
- name: headline
dtype: string
- name: content
dtype: string
- name: category
dtype:
class_label:
names:
'0': Sponsor
'1': Big Tech & Startups
'2': Science and Futuristic Technology
'3': Programming, Design & Data Science
'4': Miscellaneous
splits:
- name: train
num_bytes: 1681328.9436817036
num_examples: 3000
download_size: 1064733
dataset_size: 1681328.9436817036
---
# Dataset Card for "tldr_news_3k"
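The `category` feature in the YAML metadata above is a `class_label`; a minimal sketch of mapping the integer labels back to their names (the list simply mirrors that metadata):

```python
# int -> str mapping taken from the class_label names in the YAML above.
CATEGORY_NAMES = [
    "Sponsor",
    "Big Tech & Startups",
    "Science and Futuristic Technology",
    "Programming, Design & Data Science",
    "Miscellaneous",
]

def label_to_name(label: int) -> str:
    """Return the human-readable category for an integer class label."""
    return CATEGORY_NAMES[label]
```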
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shreyasharma/masked_step_label2 | ---
dataset_info:
features:
- name: step
dtype: string
- name: label
dtype: string
- name: transformed_sentence
dtype: string
- name: token_strs
dtype: string
splits:
- name: train
num_bytes: 2636233
num_examples: 6216
download_size: 432737
dataset_size: 2636233
---
# Dataset Card for "masked_step_label2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
miragepa/ANDROIDEN18 | ---
license: openrail
---
|
sap-ai-research/datasets-for-micse | ---
language:
- en
--- |
open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v2 | ---
pretty_name: Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [indischepartij/OpenMia-Indo-Mistral-7b-v2](https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T21:08:39.122090](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v2/blob/main/results_2024-02-02T21-08-39.122090.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6240607574132118,\n\
\ \"acc_stderr\": 0.032532796626580374,\n \"acc_norm\": 0.6300113550132161,\n\
\ \"acc_norm_stderr\": 0.033195769514987344,\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.016150201321323013,\n \"mc2\": 0.4434739529053457,\n\
\ \"mc2_stderr\": 0.014529702448189592\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5733788395904437,\n \"acc_stderr\": 0.014453185592920293,\n\
\ \"acc_norm\": 0.6032423208191127,\n \"acc_norm_stderr\": 0.014296513020180639\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6270663214499104,\n\
\ \"acc_stderr\": 0.004825963768772224,\n \"acc_norm\": 0.8311093407687712,\n\
\ \"acc_norm_stderr\": 0.0037388962449538122\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.03878139888797611,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.03878139888797611\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\
\ \"acc_stderr\": 0.024472243840895514,\n \"acc_norm\": 0.7548387096774194,\n\
\ \"acc_norm_stderr\": 0.024472243840895514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396997,\n\
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396997\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.031429466378837076,\n\
\ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.031429466378837076\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936077,\n \"\
acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936077\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.0340763209385405,\n \"acc_norm\"\
: 0.5185185185185185,\n \"acc_norm_stderr\": 0.0340763209385405\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n\
\ \"acc_stderr\": 0.027865942286639318,\n \"acc_norm\": 0.803921568627451,\n\
\ \"acc_norm_stderr\": 0.027865942286639318\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n\
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.04738975119274155,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.04738975119274155\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n\
\ \"acc_stderr\": 0.014283378044296418,\n \"acc_norm\": 0.8007662835249042,\n\
\ \"acc_norm_stderr\": 0.014283378044296418\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134128,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134128\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n\
\ \"acc_stderr\": 0.01498732543996355,\n \"acc_norm\": 0.2782122905027933,\n\
\ \"acc_norm_stderr\": 0.01498732543996355\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n\
\ \"acc_stderr\": 0.012685906538206244,\n \"acc_norm\": 0.4426336375488918,\n\
\ \"acc_norm_stderr\": 0.012685906538206244\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6552287581699346,\n \"acc_stderr\": 0.01922832201869664,\n \
\ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.01922832201869664\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197773,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197773\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.016150201321323013,\n \"mc2\": 0.4434739529053457,\n\
\ \"mc2_stderr\": 0.014529702448189592\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.011585871710209403\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3479909021986353,\n \
\ \"acc_stderr\": 0.013120581030382132\n }\n}\n```"
repo_url: https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|arc:challenge|25_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|gsm8k|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hellaswag|10_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T21-08-39.122090.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T21-08-39.122090.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- '**/details_harness|winogrande|5_2024-02-02T21-08-39.122090.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T21-08-39.122090.parquet'
- config_name: results
data_files:
- split: 2024_02_02T21_08_39.122090
path:
- results_2024-02-02T21-08-39.122090.parquet
- split: latest
path:
- results_2024-02-02T21-08-39.122090.parquet
---
# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [indischepartij/OpenMia-Indo-Mistral-7b-v2](https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v2",
"harness_winogrande_5",
	split="latest")
```
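Once loaded, each row carries the per-sample evaluation details, while the aggregated metrics (as reported in the "Latest results" section of this card) come as `acc`/`acc_stderr` pairs. A minimal sketch of how such a pair can be formatted for display, using the "all" values from this run as illustrative inputs (the `fmt` helper is hypothetical, not part of the leaderboard tooling):

```python
# Format an accuracy metric and its standard error as a percentage string.
# Input values are the aggregated "all" metrics from this evaluation run.
metrics = {
    "acc": 0.6240607574132118,
    "acc_stderr": 0.032532796626580374,
}

def fmt(value: float, stderr: float) -> str:
    """Render a metric as 'XX.XX ± Y.YY' in percent."""
    return f"{value * 100:.2f} \u00b1 {stderr * 100:.2f}"

print(fmt(metrics["acc"], metrics["acc_stderr"]))  # 62.41 ± 3.25
```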
## Latest results
These are the [latest results from run 2024-02-02T21:08:39.122090](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v2/blob/main/results_2024-02-02T21-08-39.122090.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, with the most recent run under the "latest" split):
```python
{
"all": {
"acc": 0.6240607574132118,
"acc_stderr": 0.032532796626580374,
"acc_norm": 0.6300113550132161,
"acc_norm_stderr": 0.033195769514987344,
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323013,
"mc2": 0.4434739529053457,
"mc2_stderr": 0.014529702448189592
},
"harness|arc:challenge|25": {
"acc": 0.5733788395904437,
"acc_stderr": 0.014453185592920293,
"acc_norm": 0.6032423208191127,
"acc_norm_stderr": 0.014296513020180639
},
"harness|hellaswag|10": {
"acc": 0.6270663214499104,
"acc_stderr": 0.004825963768772224,
"acc_norm": 0.8311093407687712,
"acc_norm_stderr": 0.0037388962449538122
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.03878139888797611,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.03878139888797611
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895514,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396997,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396997
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936077,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936077
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.0340763209385405,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.0340763209385405
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.04738975119274155,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.04738975119274155
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296418,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2782122905027933,
"acc_stderr": 0.01498732543996355,
"acc_norm": 0.2782122905027933,
"acc_norm_stderr": 0.01498732543996355
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984824,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.012685906538206244,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.012685906538206244
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.01922832201869664,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.01922832201869664
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197773,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197773
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323013,
"mc2": 0.4434739529053457,
"mc2_stderr": 0.014529702448189592
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.011585871710209403
},
"harness|gsm8k|5": {
"acc": 0.3479909021986353,
"acc_stderr": 0.013120581030382132
}
}
```
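The per-task scores above can be combined into a simple macro-average. A minimal sketch (using three of the headline values copied from the JSON block above; the grouping into a single dict is illustrative, not part of the harness output):

```python
# Macro-average a few headline scores; values are copied verbatim
# from the results JSON above.
scores = {
    "arc_challenge_acc_norm": 0.6032423208191127,
    "hellaswag_acc_norm": 0.8311093407687712,
    "winogrande_acc": 0.7829518547750592,
}

macro_avg = sum(scores.values()) / len(scores)
print(f"macro average: {macro_avg:.4f}")
```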
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Rico777/hgfyuc | ---
license: unknown
---
|
KETI-AIR/kor_glue | ---
dataset_info:
- config_name: cola
features:
- name: data_index_by_user
dtype: int32
- name: label
dtype: int32
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 569511
num_examples: 8551
- name: validation
num_bytes: 72661
num_examples: 1043
- name: test
num_bytes: 72979
num_examples: 1063
download_size: 381894
dataset_size: 715151
- config_name: mrpc
features:
- name: data_index_by_user
dtype: int32
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int32
- name: idx
dtype: int32
splits:
- name: train
num_bytes: 1078522
num_examples: 3668
- name: validation
num_bytes: 120306
num_examples: 408
- name: test
num_bytes: 504069
num_examples: 1725
download_size: 1176356
dataset_size: 1702897
- config_name: qnli
features:
- name: data_index_by_user
dtype: int32
- name: label
dtype: int32
- name: question
dtype: string
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 28343211
num_examples: 104743
- name: validation
num_bytes: 1507016
num_examples: 5463
- name: test
num_bytes: 1510880
num_examples: 5463
download_size: 21097078
dataset_size: 31361107
- config_name: qqp
features:
- name: data_index_by_user
dtype: int32
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int32
- name: idx
dtype: int32
splits:
- name: train
num_bytes: 64564524
num_examples: 363846
download_size: 40798086
dataset_size: 64564524
- config_name: wnli
features:
- name: data_index_by_user
dtype: int32
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int32
- name: idx
dtype: int32
splits:
- name: train
num_bytes: 132171
num_examples: 635
- name: validation
num_bytes: 15331
num_examples: 71
- name: test
num_bytes: 47430
num_examples: 146
download_size: 80151
dataset_size: 194932
configs:
- config_name: cola
data_files:
- split: train
path: cola/train-*
- split: validation
path: cola/validation-*
- split: test
path: cola/test-*
- config_name: mrpc
data_files:
- split: train
path: mrpc/train-*
- split: validation
path: mrpc/validation-*
- split: test
path: mrpc/test-*
- config_name: qnli
data_files:
- split: train
path: qnli/train-*
- split: validation
path: qnli/validation-*
- split: test
path: qnli/test-*
- config_name: qqp
data_files:
- split: train
path: qqp/train-*
- config_name: wnli
data_files:
- split: train
path: wnli/train-*
- split: validation
path: wnli/validation-*
- split: test
path: wnli/test-*
license: cc-by-4.0
---
# Dataset Card for "kor_glue"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
# Source Data Citation Information
```
@article{warstadt2018neural,
title={Neural Network Acceptability Judgments},
author={Warstadt, Alex and Singh, Amanpreet and Bowman, Samuel R},
journal={arXiv preprint arXiv:1805.12471},
year={2018}
}
@inproceedings{wang2019glue,
title={{GLUE}: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding},
author={Wang, Alex and Singh, Amanpreet and Michael, Julian and Hill, Felix and Levy, Omer and Bowman, Samuel R.},
note={In the Proceedings of ICLR.},
year={2019}
}
Note that each GLUE dataset has its own citation. Please see the source to see
the correct citation for each contained dataset.
``` |
kaushik1064/Arakoo_dataset | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task
dtype: string
- name: token_count
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4662439.075110457
num_examples: 1901
- name: test
num_bytes: 1998888.9248895436
num_examples: 815
download_size: 3635391
dataset_size: 6661328.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
PorYoung/Kasugano-Sora | ---
license: mit
---
# Kasugano Sora (春日野穹) Voice Dataset
This dataset is extracted from *Yosuga no Sora* and *Haruka na Sora*, with some explicit audio clips removed.
## Dataset Description
### Yosuga no Sora
### Haruka na Sora
### Singing voice of Hiroko Taguchi (Miyamura Miyako)
## Disclaimer
The contents of this project are for learning and exchange purposes only. Commercial use and any other illegal activities, or activities that violate public order and morals, are strictly prohibited. Please delete within 24 hours! |
Dzeniks/fever-nei-wiki-based | ---
license: mit
---
|
hieuhocnlp/lstm-deep-usc-test | ---
dataset_info:
features:
- name: line
dtype: string
splits:
- name: train
num_bytes: 770852
num_examples: 55043
download_size: 466754
dataset_size: 770852
---
# Dataset Card for "lstm-deep-usc-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chrisgg1/keywords_verbinden3 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: label
dtype:
class_label:
names:
'0': silence
'1': unknown
'2': verbinden
splits:
- name: train
num_bytes: 2065984780.822
num_examples: 46449
download_size: 1404548901
dataset_size: 2065984780.822
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fathan/autotrain-data-code-mixed-language-identification | ---
task_categories:
- token-classification
---
# AutoTrain Dataset for project: code-mixed-language-identification
## Dataset Description
This dataset has been automatically processed by AutoTrain for project code-mixed-language-identification.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"feat_Unnamed: 0": 1104,
"tokens": [
"@user",
"salah",
"satu",
"dari",
"4",
"anak",
"dr",
"sunardi",
"ada",
"yg",
"berprofesi",
"sbg",
"dokter",
"juga",
",",
"lulusan",
"unair",
",",
"sudah",
"selesai",
"koas",
"dan",
"intern",
"tolong",
"disupport",
"pak",
"anak",
"beliau"
],
"tags": [
6,
1,
1,
1,
6,
1,
6,
6,
1,
1,
1,
1,
1,
1,
6,
1,
6,
6,
1,
1,
1,
1,
0,
1,
3,
1,
1,
1
]
},
{
"feat_Unnamed: 0": 239,
"tokens": [
"@user",
"kamu",
"pake",
"apa",
"toh",
"?",
"aku",
"pake",
"xl",
"banter",
"lho",
"di",
"apartemen",
"pun",
"bisa",
"download",
"yutub"
],
"tags": [
6,
1,
1,
1,
1,
6,
1,
1,
6,
1,
1,
1,
1,
1,
1,
0,
6
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"feat_Unnamed: 0": "Value(dtype='int64', id=None)",
"tokens": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"tags": "Sequence(feature=ClassLabel(names=['EN', 'ID', 'JV', 'MIX_ID_EN', 'MIX_ID_JV', 'MIX_JV_EN', 'OTH'], id=None), length=-1, id=None)"
}
```
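Since the `tags` field stores class indices, decoding a sample back to the label names listed in the `ClassLabel` definition above takes only a lookup table. A minimal sketch (the helper name is illustrative):

```python
# Label names copied from the ClassLabel definition above.
LABEL_NAMES = ["EN", "ID", "JV", "MIX_ID_EN", "MIX_ID_JV", "MIX_JV_EN", "OTH"]

def decode_tags(tag_ids):
    """Map integer tag ids back to their language-identification labels."""
    return [LABEL_NAMES[i] for i in tag_ids]

# First few tags of the second sample shown above.
print(decode_tags([6, 1, 1, 1, 1, 6]))  # ['OTH', 'ID', 'ID', 'ID', 'ID', 'OTH']
```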
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 1105 |
| valid | 438 |
|
bzb2023/Zhihu-KOL-More-Than-100-Upvotes | ---
license: apache-2.0
task_categories:
- text-generation
language:
- zh
---
A preliminary cleanup of the data from https://huggingface.co/datasets/wangrui6/Zhihu-KOL, keeping only entries with 100 or more upvotes.
271,261 entries in total. |
serhatkurt/data_modelGenerated | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1982154.0
num_examples: 16
download_size: 1983278
dataset_size: 1982154.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data_modelGenerated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Gt-Doremiti/gt-doremiti-instructions | ---
license: cc-by-4.0
language:
- fr
tags:
- instruction-finetuning
pretty_name: gt-doremiti-instructions
task_categories:
- text-generation
---
# Dataset Card for gt-doremiti-instructions
## Dataset Description
An instruction set for fine-tuning an LLM, following the recommendations of the Stanford-Alpaca project (https://github.com/tatsu-lab/stanford_alpaca).
These instructions are extracted from the FAQ created by the GT DOREMITI, available at https://gt-atelier-donnees.miti.cnrs.fr/faq.html.
The data is made available under the terms of the Creative Commons Attribution 4.0 International License.
|
CyberHarem/ebihara_naho_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ebihara_naho/海老原菜帆 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of ebihara_naho/海老原菜帆 (THE iDOLM@STER: Cinderella Girls), containing 109 images and their tags.
The core tags of this character are `breasts, black_hair, large_breasts, green_eyes, ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 109 | 104.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ebihara_naho_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 109 | 72.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ebihara_naho_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 240 | 143.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ebihara_naho_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 109 | 97.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ebihara_naho_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 240 | 185.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ebihara_naho_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ebihara_naho_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, smile, solo, cleavage, necklace, hair_scrunchie, open_mouth, thighs |
| 1 | 5 |  |  |  |  |  | 1girl, hair_scrunchie, red_bowtie, school_uniform, smile, blush, pleated_skirt, blue_skirt, looking_at_viewer, polka_dot_scrunchie, single_hair_bun, sitting, solo, white_shirt, blue_sweater, cherry_blossoms, closed_mouth, jacket, miniskirt, outdoors, petals |
| 2 | 5 |  |  |  |  |  | 1girl, blush, long_hair, cleavage, demon_tail, heart, looking_at_viewer, smile, black_bikini, bracelet, demon_horns, navel, solo, demon_girl, demon_wings, female_pubic_hair, nail_polish, open_mouth, symbol-shaped_pupils |
| 3 | 8 |  |  |  |  |  | 1girl, 1boy, blush, hetero, mosaic_censoring, solo_focus, brown_hair, female_pubic_hair, nipples, nude, penis, smile, cum_on_breasts, open_mouth, sex, short_hair, breast_grab, cum_in_pussy, grabbing, looking_at_viewer, mixed_bathing, navel, spread_legs, sweat, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | smile | solo | cleavage | necklace | hair_scrunchie | open_mouth | thighs | red_bowtie | school_uniform | pleated_skirt | blue_skirt | polka_dot_scrunchie | single_hair_bun | sitting | white_shirt | blue_sweater | cherry_blossoms | closed_mouth | jacket | miniskirt | outdoors | petals | long_hair | demon_tail | heart | black_bikini | bracelet | demon_horns | navel | demon_girl | demon_wings | female_pubic_hair | nail_polish | symbol-shaped_pupils | 1boy | hetero | mosaic_censoring | solo_focus | brown_hair | nipples | nude | penis | cum_on_breasts | sex | short_hair | breast_grab | cum_in_pussy | grabbing | mixed_bathing | spread_legs | sweat | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:--------|:-------|:-----------|:-----------|:-----------------|:-------------|:---------|:-------------|:-----------------|:----------------|:-------------|:----------------------|:------------------|:----------|:--------------|:---------------|:------------------|:---------------|:---------|:------------|:-----------|:---------|:------------|:-------------|:--------|:---------------|:-----------|:--------------|:--------|:-------------|:--------------|:--------------------|:--------------|:-----------------------|:-------|:---------|:-------------------|:-------------|:-------------|:----------|:-------|:--------|:-----------------|:------|:-------------|:--------------|:---------------|:-----------|:----------------|:--------------|:--------|:----------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
anashrivastava/tl-rephrase-hf | ---
dataset_info:
features:
- name: filename
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 173944
num_examples: 1080
download_size: 42250
dataset_size: 173944
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
allenai/wcep_dense_oracle | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- other
multilinguality:
- monolingual
pretty_name: WCEP-10
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- summarization
task_ids:
- news-articles-summarization
paperswithcode_id: wcep
train-eval-index:
- config: default
task: summarization
task_id: summarization
splits:
train_split: train
eval_split: test
col_mapping:
document: text
summary: target
metrics:
- type: rouge
name: Rouge
---
This is a copy of the [WCEP-10](https://huggingface.co/datasets/ccdv/WCEP-10) dataset, except the input source documents of the `train`, `validation`, and `test` splits have been replaced with documents retrieved by a __dense__ retriever. The retrieval pipeline used:
- __query__: The `summary` field of each example
- __corpus__: The union of all documents in the `train`, `validation` and `test` splits
- __retriever__: [`facebook/contriever-msmarco`](https://huggingface.co/facebook/contriever-msmarco) via [PyTerrier](https://pyterrier.readthedocs.io/en/latest/) with default settings
- __top-k strategy__: `"oracle"`, i.e. the number of documents retrieved, `k`, is set as the original number of input documents for each example
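Under the `"oracle"` top-k strategy, `k` equals the number of gold input documents per example, so Precision@k, Recall@k, and R-precision coincide by construction (which is why those three columns are identical in the tables below). A small sketch illustrating this, using hypothetical document ids:

```python
# Hypothetical document ids; the oracle strategy sets k = |relevant|.
relevant = {"d1", "d2", "d3", "d4"}
retrieved = ["d1", "d9", "d2", "d7"]  # ranked top-k list, k = len(relevant)

k = len(relevant)
hits = sum(1 for d in retrieved[:k] if d in relevant)
precision_at_k = hits / k                # hits over retrieved count
recall_at_k = hits / len(relevant)       # hits over relevant count
r_precision = hits / len(relevant)       # same denominator when k = |relevant|

print(precision_at_k, recall_at_k, r_precision)  # all equal: 0.5
```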
Retrieval results on the `train` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.8590 | 0.6490 | 0.6490 | 0.6490 |
Retrieval results on the `validation` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.8578 | 0.6326 | 0.6326 | 0.6326 |
Retrieval results on the `test` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.8678 | 0.6631 | 0.6631 | 0.6631 | |
CyberHarem/satou_masuki_bangdream | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of satou_masuki/佐藤ますき (BanG Dream!)
This is the dataset of satou_masuki/佐藤ますき (BanG Dream!), containing 62 images and their tags.
The core tags of this character are `blonde_hair, short_hair, yellow_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 62 | 57.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satou_masuki_bangdream/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 62 | 44.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satou_masuki_bangdream/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 123 | 80.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satou_masuki_bangdream/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 62 | 54.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satou_masuki_bangdream/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 123 | 98.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satou_masuki_bangdream/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/satou_masuki_bangdream',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
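The non-raw packages (`800`, `1200`, `stage3-p480-*`) are plain IMG+TXT archives: each image sits next to a same-named `.txt` file holding its comma-separated tags, so they can also be consumed without waifuc. A minimal sketch of pairing images with their tag files after extraction (the helper name `load_img_txt_pairs` is illustrative, not part of the dataset):

```python
import os

def load_img_txt_pairs(dataset_dir):
    """Pair each image in an extracted IMG+TXT package with its tag list."""
    image_exts = {'.png', '.jpg', '.jpeg', '.webp'}
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in image_exts:
            continue
        tag_path = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(tag_path):
            continue  # image without a matching tag file; skip it
        with open(tag_path, encoding='utf-8') as f:
            # tag files are comma-separated, e.g. "1girl, solo, smile"
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```

Point it at the directory produced by extracting e.g. `dataset-800.zip` to get `(image_path, tags)` tuples ready for a training pipeline.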
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, solo, upper_body, jacket, shirt, simple_background, white_background |
| 1 | 17 |  |  |  |  |  | 1girl, crop_top, solo, midriff, holding, looking_at_viewer, navel, long_sleeves, black_shirt, drumsticks, fingerless_gloves, open_jacket, open_mouth, smile, breasts, collarbone, drum_set, earrings, red_jacket, simple_background, skirt, v-shaped_eyebrows, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | upper_body | jacket | shirt | simple_background | white_background | crop_top | midriff | holding | navel | long_sleeves | black_shirt | drumsticks | fingerless_gloves | open_jacket | open_mouth | smile | breasts | collarbone | drum_set | earrings | red_jacket | skirt | v-shaped_eyebrows |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------------|:---------|:--------|:--------------------|:-------------------|:-----------|:----------|:----------|:--------|:---------------|:--------------|:-------------|:--------------------|:--------------|:-------------|:--------|:----------|:-------------|:-----------|:-----------|:-------------|:--------|:--------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 17 |  |  |  |  |  | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
mmuttharasan/llmjptk1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 81960.0
num_examples: 10
- name: test
num_bytes: 16392.0
num_examples: 2
download_size: 38350
dataset_size: 98352.0
---
# Dataset Card for "llmjptk1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_arc_en_conf_llama_nearestscore_true_y | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: validation
num_bytes: 80031.0
num_examples: 250
download_size: 46853
dataset_size: 80031.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_arc_en_conf_llama_nearestscore_true_y"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
maidalun1020/CrosslingualRetrievalLawZh2En-qrels | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
dataset_info:
features:
- name: qid
dtype: string
- name: pid
dtype: string
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 668746
num_examples: 27458
download_size: 358333
dataset_size: 668746
---
|
ovior/twitter_dataset_1713004544 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2680040
num_examples: 7878
download_size: 1523804
dataset_size: 2680040
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
awettig/subj | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: label_text
dtype: string
splits:
- name: train
num_bytes: 1231802
num_examples: 8000
- name: test
num_bytes: 310282
num_examples: 2000
download_size: 946189
dataset_size: 1542084
---
# Dataset Card for "subj"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xbsd/xbsd-guanaco-llama2-1k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
---
# Dataset Card for "xbsd-guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/luftVersorgen-200-undersampled | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': LuftBereitstellen
'1': LuftVerteilen
splits:
- name: train
num_bytes: 79029.72241029113
num_examples: 400
- name: test
num_bytes: 290707
num_examples: 1477
- name: valid
num_bytes: 290707
num_examples: 1477
download_size: 247001
dataset_size: 660443.7224102912
---
# Dataset Card for "luftVersorgen-200-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mpiquero/Upscalers | ---
license: creativeml-openrail-m
---
|
open-llm-leaderboard/details_Gille__StrangeMerges_32-7B-slerp | ---
pretty_name: Evaluation run of Gille/StrangeMerges_32-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gille/StrangeMerges_32-7B-slerp](https://huggingface.co/Gille/StrangeMerges_32-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_32-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-07T11:21:21.691903](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_32-7B-slerp/blob/main/results_2024-03-07T11-21-21.691903.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6518512233895413,\n\
\ \"acc_stderr\": 0.03205172778518449,\n \"acc_norm\": 0.6508163233940778,\n\
\ \"acc_norm_stderr\": 0.032728133415798846,\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7793840613265378,\n\
\ \"mc2_stderr\": 0.013680728445626752\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7133105802047781,\n \"acc_stderr\": 0.013214986329274776,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.714299940250946,\n\
\ \"acc_stderr\": 0.004508239594503832,\n \"acc_norm\": 0.8899621589324835,\n\
\ \"acc_norm_stderr\": 0.0031229736320394727\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886804,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886804\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931048,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931048\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993469,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993469\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4435754189944134,\n\
\ \"acc_stderr\": 0.01661568040100372,\n \"acc_norm\": 0.4435754189944134,\n\
\ \"acc_norm_stderr\": 0.01661568040100372\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.012752858346533131,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.012752858346533131\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7793840613265378,\n\
\ \"mc2_stderr\": 0.013680728445626752\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8500394632991318,\n \"acc_stderr\": 0.010034394804580809\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7065959059893859,\n \
\ \"acc_stderr\": 0.01254183081546149\n }\n}\n```"
repo_url: https://huggingface.co/Gille/StrangeMerges_32-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|arc:challenge|25_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|gsm8k|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hellaswag|10_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T11-21-21.691903.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T11-21-21.691903.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- '**/details_harness|winogrande|5_2024-03-07T11-21-21.691903.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-07T11-21-21.691903.parquet'
- config_name: results
data_files:
- split: 2024_03_07T11_21_21.691903
path:
- results_2024-03-07T11-21-21.691903.parquet
- split: latest
path:
- results_2024-03-07T11-21-21.691903.parquet
---
# Dataset Card for Evaluation run of Gille/StrangeMerges_32-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_32-7B-slerp](https://huggingface.co/Gille/StrangeMerges_32-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_32-7B-slerp",
"harness_winogrande_5",
split="train")
```
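As the configs above show, each timestamped split name is derived from the run timestamp by replacing `-` and `:` with `_`. A small helper (a sketch, assuming only that naming convention) can convert a run timestamp into the split name to pass to `load_dataset`:

```python
def timestamp_to_split_name(run_timestamp: str) -> str:
    """Convert a run timestamp (e.g. '2024-03-07T11:21:21.691903')
    into the split name used in this dataset's configs
    (e.g. '2024_03_07T11_21_21.691903')."""
    return run_timestamp.replace("-", "_").replace(":", "_")

split = timestamp_to_split_name("2024-03-07T11:21:21.691903")
# split == "2024_03_07T11_21_21.691903"
```

When you don't care about a specific run, the `"latest"` split is the simpler choice.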
## Latest results
These are the [latest results from run 2024-03-07T11:21:21.691903](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_32-7B-slerp/blob/main/results_2024-03-07T11-21-21.691903.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its task-specific configuration, under the "latest" split):
```json
{
"all": {
"acc": 0.6518512233895413,
"acc_stderr": 0.03205172778518449,
"acc_norm": 0.6508163233940778,
"acc_norm_stderr": 0.032728133415798846,
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7793840613265378,
"mc2_stderr": 0.013680728445626752
},
"harness|arc:challenge|25": {
"acc": 0.7133105802047781,
"acc_stderr": 0.013214986329274776,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.714299940250946,
"acc_stderr": 0.004508239594503832,
"acc_norm": 0.8899621589324835,
"acc_norm_stderr": 0.0031229736320394727
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886804,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886804
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931048,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931048
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993469,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993469
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4435754189944134,
"acc_stderr": 0.01661568040100372,
"acc_norm": 0.4435754189944134,
"acc_norm_stderr": 0.01661568040100372
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533131,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533131
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7793840613265378,
"mc2_stderr": 0.013680728445626752
},
"harness|winogrande|5": {
"acc": 0.8500394632991318,
"acc_stderr": 0.010034394804580809
},
"harness|gsm8k|5": {
"acc": 0.7065959059893859,
"acc_stderr": 0.01254183081546149
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
BG5/data | ---
license: mit
---
|
sammyfroly/ladyoscar | ---
license: openrail
---
|
akkasi/dutch_social | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence: float64
- name: label2idx
dtype: string
- name: idx2label
dtype: string
splits:
- name: train
num_bytes: 196538058
num_examples: 162805
- name: test
num_bytes: 65499632
num_examples: 54268
download_size: 24975837
dataset_size: 262037690
---
# Dataset Card for "dutch_social"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DJR1987/ikki | ---
license: openrail
---
|
Ozymandias314/MolCorefData | ---
license: mit
tags:
- chemistry
---
# MolDetect and MolCoref Data
The MolDetect and MolCoref models, along with additional instructions for testing and running them, can be found in this [github repository](https://github.com/Ozymandias314/MolDetect).
The reaction diagrams are located at [`images.zip`](images.zip).
Additionally, we use a 70-10-20 split in our experiments. The full train/dev/test split for each task is available in this repository as well.
This [notebook](https://github.com/Ozymandias314/MolDetect/blob/main/notebook/visualize_data.ipynb) shows how to visualize the diagram and the ground truth. |
StofEzz/preprocessed_data_2200 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: input_values
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 478695272
num_examples: 2000
- name: test
num_bytes: 33556448
num_examples: 100
- name: validation
num_bytes: 30626216
num_examples: 100
download_size: 542536883
dataset_size: 542877936
---
# Dataset Card for "preprocessed_data_2200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thobauma/harmless-poisoned-0.04-chuela2502-murder | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 58402939.44335993
num_examples: 42537
download_size: 31364075
dataset_size: 58402939.44335993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alperiox/weapons_captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 603040.0
num_examples: 16
download_size: 605235
dataset_size: 603040.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Astris/sweeter_nectar | ---
language:
- en
dataset_info:
features:
- name: prompt
dtype: string
- name: good_natured
dtype: bool
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: chosen_model
dtype: string
- name: rejected_model
dtype: string
- name: chosen_rank
dtype: float64
- name: rejected_rank
dtype: float64
splits:
- name: train
num_bytes: 1163594641
num_examples: 502861
download_size: 454177624
dataset_size: 1163594641
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Berkeley's Nectar Dataset, reformatted for DPO and edited to consider prompt denials as rejected generations.
If a response contained any of the following phrases, it was considered a prompt denial: ["I'm sorry, but", "I apologize, but", "not appropriate", "as an AI", "As an artificial intelligence", "OpenAI"]
If all of the responses for a given prompt were denials, the prompt was dropped altogether.
If only some were denials, the top remaining responses were marked "chosen" and the denials "rejected".
If none were denials, the higher-ranked responses were put in "chosen" and the lower-ranked responses in "rejected".
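The filtering rules above can be sketched in plain Python. The phrase list is copied verbatim from this card; the function names and the exact pairing strategy are illustrative assumptions, not the dataset authors' actual script:

```python
# Sketch of the denial-filtering logic described above (illustrative, not the
# original processing script). Matching is a case-sensitive substring check.
DENIAL_PHRASES = [
    "I'm sorry, but", "I apologize, but", "not appropriate",
    "as an AI", "As an artificial intelligence", "OpenAI",
]

def is_denial(response: str) -> bool:
    """A response counts as a prompt denial if it contains any listed phrase."""
    return any(phrase in response for phrase in DENIAL_PHRASES)

def build_pairs(responses):
    """responses: list of (text, rank) tuples sorted best-first.

    Returns (chosen, rejected) pairs, or [] when every response is a
    denial, in which case the prompt would be dropped altogether.
    """
    denials = [text for text, _ in responses if is_denial(text)]
    kept = [text for text, _ in responses if not is_denial(text)]
    if not kept:
        return []  # all responses are denials: drop the prompt
    if denials:
        # some denials: pair the top non-denial response against each denial
        return [(kept[0], d) for d in denials]
    # no denials: pair the highest-ranked against the lowest-ranked response
    return [(kept[0], kept[-1])]
```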
|
multilingual/orca_dpo_pairs | ---
dataset_info:
features:
- name: mllm_index
dtype: string
- name: system
dtype: string
- name: question
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: de_train
num_bytes: 38060434
num_examples: 11687
- name: ar_train
num_bytes: 14211631
num_examples: 3402
- name: zh_train
num_bytes: 29021389
num_examples: 11687
- name: es_train
num_bytes: 36064831
num_examples: 11687
- name: fr_train
num_bytes: 36580202
num_examples: 11104
- name: ru_train
num_bytes: 59694973
num_examples: 11687
- name: tr_train
num_bytes: 14211631
num_examples: 3402
download_size: 117157771
dataset_size: 227845091
configs:
- config_name: default
data_files:
- split: ar_train
path: data/ar_train-*
- split: zh_train
path: data/cn_train-*
- split: de_train
path: data/de_train-*
- split: es_train
path: data/es_train-*
- split: fr_train
path: data/fr_train-*
- split: ru_train
path: data/ru_train-*
- split: tr_train
path: data/tr_train-*
task_categories:
- text-generation
language:
- ar
- zh
- de
- fr
- es
- tr
- ru
tags:
- mllm
- multilingual
- rlhf
- dpo
license: apache-2.0
---
<div>
<img src="https://huggingface.co/datasets/multilingual/orca_dpo_pairs/resolve/main/orca_dpo_pairs_cover.png">
</div>
mLLM IMPLEMENTATION OF [Intel/orca_dpo_pairs](https://huggingface.co/datasets/Intel/orca_dpo_pairs).
LANGUAGES:
ARABIC
CHINESE
FRENCH
GERMAN
RUSSIAN
SPANISH
TURKISH
(WIP) |
imsoumyaneel/prompt-data | ---
license: mit
---
|
zolak/twitter_dataset_50_1713102953 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2888669
num_examples: 7016
download_size: 1460772
dataset_size: 2888669
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/honolulu_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of honolulu (Kantai Collection)
This is the dataset of honolulu (Kantai Collection), containing 219 images and their tags.
The core tags of this character are `blonde_hair, long_hair, breasts, blue_eyes, drill_hair, large_breasts, twintails, twin_drills, hair_ornament, hair_flower`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 219 | 238.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honolulu_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 219 | 145.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honolulu_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 559 | 329.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honolulu_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 219 | 215.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honolulu_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 559 | 451.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honolulu_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/honolulu_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be minable from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, simple_background, solo, string_bikini, white_bikini, cleavage, dated, looking_at_viewer, one-hour_drawing_challenge, side-tie_bikini_bottom, white_background, cowboy_shot, flower, open_mouth, twitter_username |
| 1 | 24 |  |  |  |  |  | 1girl, solo, white_bikini, side-tie_bikini_bottom, red_flower, cleavage, cowboy_shot, navel, string_bikini, looking_at_viewer, open_mouth, smile, hibiscus, blush, day, halterneck, official_alternate_costume, outdoors, collarbone, cloud, blue_sky, ocean |
| 2 | 9 |  |  |  |  |  | 1girl, breast_pocket, headgear, looking_at_viewer, military_uniform, red_ascot, sleeveless_jacket, solo, upper_body, one-hour_drawing_challenge, twitter_username, simple_background, white_background |
| 3 | 9 |  |  |  |  |  | 1girl, breast_pocket, red_ascot, simple_background, solo, headgear, sleeveless_jacket, white_background, cowboy_shot, dress, looking_at_viewer, smile, skirt, armpits |
| 4 | 7 |  |  |  |  |  | 1girl, solo, twitter_username, white_shirt, alternate_costume, blush, cleavage, one-hour_drawing_challenge, pleated_skirt, simple_background, white_background, collared_shirt, looking_at_viewer, school_uniform, smile, cowboy_shot, open_mouth, short_sleeves |
| 5 | 6 |  |  |  |  |  | detached_collar, playboy_bunny, rabbit_ears, cleavage, fake_animal_ears, looking_at_viewer, pantyhose, simple_background, strapless_leotard, white_background, wrist_cuffs, 1girl, cowboy_shot, rabbit_tail, solo, bowtie, smile |
| 6 | 5 |  |  |  |  |  | 1girl, blue_kimono, official_alternate_costume, ponytail, simple_background, solo, white_background, yukata, blush, eating, takoyaki, obi, full_body, holding_food, looking_at_viewer, mask_on_head, open_mouth, sandals, upper_body |
| 7 | 6 |  |  |  |  |  | 1girl, black_pantyhose, christmas, fur-trimmed_dress, red_dress, santa_costume, solo, cleavage, fur-trimmed_capelet, fur-trimmed_gloves, red_capelet, red_gloves, fake_mustache, alternate_costume, looking_at_viewer, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | simple_background | solo | string_bikini | white_bikini | cleavage | dated | looking_at_viewer | one-hour_drawing_challenge | side-tie_bikini_bottom | white_background | cowboy_shot | flower | open_mouth | twitter_username | red_flower | navel | smile | hibiscus | blush | day | halterneck | official_alternate_costume | outdoors | collarbone | cloud | blue_sky | ocean | breast_pocket | headgear | military_uniform | red_ascot | sleeveless_jacket | upper_body | dress | skirt | armpits | white_shirt | alternate_costume | pleated_skirt | collared_shirt | school_uniform | short_sleeves | detached_collar | playboy_bunny | rabbit_ears | fake_animal_ears | pantyhose | strapless_leotard | wrist_cuffs | rabbit_tail | bowtie | blue_kimono | ponytail | yukata | eating | takoyaki | obi | full_body | holding_food | mask_on_head | sandals | black_pantyhose | christmas | fur-trimmed_dress | red_dress | santa_costume | fur-trimmed_capelet | fur-trimmed_gloves | red_capelet | red_gloves | fake_mustache |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:----------------|:---------------|:-----------|:--------|:--------------------|:-----------------------------|:-------------------------|:-------------------|:--------------|:---------|:-------------|:-------------------|:-------------|:--------|:--------|:-----------|:--------|:------|:-------------|:-----------------------------|:-----------|:-------------|:--------|:-----------|:--------|:----------------|:-----------|:-------------------|:------------|:--------------------|:-------------|:--------|:--------|:----------|:--------------|:--------------------|:----------------|:-----------------|:-----------------|:----------------|:------------------|:----------------|:--------------|:-------------------|:------------|:--------------------|:--------------|:--------------|:---------|:--------------|:-----------|:---------|:---------|:-----------|:------|:------------|:---------------|:---------------|:----------|:------------------|:------------|:--------------------|:------------|:----------------|:----------------------|:---------------------|:--------------|:-------------|:----------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 24 |  |  |  |  |  | X | | X | X | X | X | | X | | X | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | | | | | X | X | | X | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | X | X | | | | | X | | | X | X | | | | | | X | | | | | | | | | | | X | X | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | X | | | X | | X | X | | X | X | | X | X | | | X | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | | | X | | X | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | | | | | X | | | X | | | X | | | | | | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | X | | | X | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
eswardivi/Bollywood_songs | ---
language:
- en
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 5387340
num_examples: 999
download_size: 2942424
dataset_size: 5387340
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Bollywood_songs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
laptak2003/AmazonDataScience | ---
license: apache-2.0
---
|
Nexdata/Mandarin_Speech_Data_by_Mobile_Phone | ---
---
# Dataset Card for Nexdata/Mandarin_Speech_Data_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1081?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
4,787 native Chinese speakers participated in the recording, with a balanced gender ratio. Speakers come from various provinces of China. The recording content is rich, covering mobile phone voice assistant interaction, smart home command and control, in-car command and control, numbers, and other fields, closely matching smart home, intelligent vehicle, and other practical application scenarios.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1081?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train models for automatic speech recognition (ASR) and speaker identification.
### Languages
Chinese Mandarin
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
pengxiang01/test | ---
task_categories:
- tabular-to-text
- table-to-text
- multiple-choice
- text-retrieval
- time-series-forecasting
- visual-question-answering
- question-answering
- zero-shot-image-classification
- depth-estimation
language:
- ab
- ak
- ar
license: bsl-1.0
tags:
- biology
- code
- medical
pretty_name: sdfsad
size_categories:
- 10K<n<100K
---
aasdfsdf |
luzDP/thiagominosIA | ---
license: openrail
---
|
open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v5 | ---
pretty_name: Evaluation run of yeontaek/llama-2-70B-ensemble-v5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-70B-ensemble-v5](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v5\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-02T15:51:19.541700](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v5/blob/main/results_2023-09-02T15%3A51%3A19.541700.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6953752417773453,\n\
\ \"acc_stderr\": 0.03133403952717257,\n \"acc_norm\": 0.6992145917201728,\n\
\ \"acc_norm_stderr\": 0.0313044221682843,\n \"mc1\": 0.4589963280293758,\n\
\ \"mc1_stderr\": 0.017444544447661192,\n \"mc2\": 0.6344801220097422,\n\
\ \"mc2_stderr\": 0.014915958195041953\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6774744027303754,\n \"acc_stderr\": 0.01365998089427737,\n\
\ \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428175\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6800438159729137,\n\
\ \"acc_stderr\": 0.004655059308602616,\n \"acc_norm\": 0.8724357697669787,\n\
\ \"acc_norm_stderr\": 0.0033292216060435208\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7396226415094339,\n \"acc_stderr\": 0.027008766090708045,\n\
\ \"acc_norm\": 0.7396226415094339,\n \"acc_norm_stderr\": 0.027008766090708045\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6723404255319149,\n \"acc_stderr\": 0.030683020843231008,\n\
\ \"acc_norm\": 0.6723404255319149,\n \"acc_norm_stderr\": 0.030683020843231008\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130726,\n \"\
acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130726\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.02188617856717253,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.02188617856717253\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9040404040404041,\n \"acc_stderr\": 0.020984808610047933,\n \"\
acc_norm\": 0.9040404040404041,\n \"acc_norm_stderr\": 0.020984808610047933\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078912,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078912\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.717948717948718,\n \"acc_stderr\": 0.022815813098896597,\n \
\ \"acc_norm\": 0.717948717948718,\n \"acc_norm_stderr\": 0.022815813098896597\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7689075630252101,\n \"acc_stderr\": 0.027381406927868883,\n\
\ \"acc_norm\": 0.7689075630252101,\n \"acc_norm_stderr\": 0.027381406927868883\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944215,\n \"\
acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944215\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958792,\n \"\
acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958792\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.02132833757080438,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.02132833757080438\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746786,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746786\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383595,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383595\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.032785485373431386,\n\
\ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.032785485373431386\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097655,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097655\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.02963471727237103,\n\
\ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.02963471727237103\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8697318007662835,\n\
\ \"acc_stderr\": 0.012036729568216055,\n \"acc_norm\": 0.8697318007662835,\n\
\ \"acc_norm_stderr\": 0.012036729568216055\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n\
\ \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6067039106145251,\n\
\ \"acc_stderr\": 0.016337268694270126,\n \"acc_norm\": 0.6067039106145251,\n\
\ \"acc_norm_stderr\": 0.016337268694270126\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.0238393033113982,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.0238393033113982\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.021613809395224802,\n\
\ \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.021613809395224802\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5638852672750978,\n\
\ \"acc_stderr\": 0.012665568135455321,\n \"acc_norm\": 0.5638852672750978,\n\
\ \"acc_norm_stderr\": 0.012665568135455321\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02679956202488766,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02679956202488766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7401960784313726,\n \"acc_stderr\": 0.017740899509177795,\n \
\ \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.017740899509177795\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.0250002560395462,\n\
\ \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.0250002560395462\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4589963280293758,\n\
\ \"mc1_stderr\": 0.017444544447661192,\n \"mc2\": 0.6344801220097422,\n\
\ \"mc2_stderr\": 0.014915958195041953\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-70B-ensemble-v5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|arc:challenge|25_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hellaswag|10_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T15:51:19.541700.parquet'
- config_name: results
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- results_2023-09-02T15:51:19.541700.parquet
- split: latest
path:
- results_2023-09-02T15:51:19.541700.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v5
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-70B-ensemble-v5
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v5](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run appears as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v5",
"harness_truthfulqa_mc_0",
         split="latest")
```
## Latest results
These are the [latest results from run 2023-09-02T15:51:19.541700](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v5/blob/main/results_2023-09-02T15%3A51%3A19.541700.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its timestamped split and in the "latest" split of the corresponding eval):
```python
{
"all": {
"acc": 0.6953752417773453,
"acc_stderr": 0.03133403952717257,
"acc_norm": 0.6992145917201728,
"acc_norm_stderr": 0.0313044221682843,
"mc1": 0.4589963280293758,
"mc1_stderr": 0.017444544447661192,
"mc2": 0.6344801220097422,
"mc2_stderr": 0.014915958195041953
},
"harness|arc:challenge|25": {
"acc": 0.6774744027303754,
"acc_stderr": 0.01365998089427737,
"acc_norm": 0.71160409556314,
"acc_norm_stderr": 0.013238394422428175
},
"harness|hellaswag|10": {
"acc": 0.6800438159729137,
"acc_stderr": 0.004655059308602616,
"acc_norm": 0.8724357697669787,
"acc_norm_stderr": 0.0033292216060435208
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7396226415094339,
"acc_stderr": 0.027008766090708045,
"acc_norm": 0.7396226415094339,
"acc_norm_stderr": 0.027008766090708045
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6723404255319149,
"acc_stderr": 0.030683020843231008,
"acc_norm": 0.6723404255319149,
"acc_norm_stderr": 0.030683020843231008
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47883597883597884,
"acc_stderr": 0.025728230952130726,
"acc_norm": 0.47883597883597884,
"acc_norm_stderr": 0.025728230952130726
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.02188617856717253,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.02188617856717253
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9040404040404041,
"acc_stderr": 0.020984808610047933,
"acc_norm": 0.9040404040404041,
"acc_norm_stderr": 0.020984808610047933
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078912,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078912
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.022815813098896597,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.022815813098896597
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7689075630252101,
"acc_stderr": 0.027381406927868883,
"acc_norm": 0.7689075630252101,
"acc_norm_stderr": 0.027381406927868883
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944215,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944215
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8880733944954129,
"acc_stderr": 0.013517352714958792,
"acc_norm": 0.8880733944954129,
"acc_norm_stderr": 0.013517352714958792
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.02132833757080438,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.02132833757080438
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.021331741829746786,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.021331741829746786
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383595,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.032785485373431386,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.032785485373431386
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097655,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097655
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.02963471727237103,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.02963471727237103
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8697318007662835,
"acc_stderr": 0.012036729568216055,
"acc_norm": 0.8697318007662835,
"acc_norm_stderr": 0.012036729568216055
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6067039106145251,
"acc_stderr": 0.016337268694270126,
"acc_norm": 0.6067039106145251,
"acc_norm_stderr": 0.016337268694270126
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.0238393033113982,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.0238393033113982
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.021613809395224802,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.021613809395224802
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5638852672750978,
"acc_stderr": 0.012665568135455321,
"acc_norm": 0.5638852672750978,
"acc_norm_stderr": 0.012665568135455321
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02679956202488766,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02679956202488766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.017740899509177795,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.017740899509177795
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.0250002560395462,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.0250002560395462
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4589963280293758,
"mc1_stderr": 0.017444544447661192,
"mc2": 0.6344801220097422,
"mc2_stderr": 0.014915958195041953
}
}
```
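As a self-contained sketch of how these per-task entries can be consumed, the snippet below averages the MMLU (`hendrycksTest`) accuracies; note that the three-entry `results` dict is an illustrative excerpt, not the full file:

```python
# Sketch: averaging per-task accuracies from a results dict shaped like the one above.
# The excerpt is illustrative; the real JSON contains all 57 hendrycksTest tasks.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.4},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6222222222222222},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8092105263157895},
}

# Select the MMLU (hendrycksTest) tasks and average their accuracies.
mmlu_accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU average over {len(mmlu_accs)} tasks: {mmlu_avg:.4f}")
```

The same pattern applies to `acc_norm` or the TruthfulQA `mc1`/`mc2` keys, filtering on the relevant task prefix instead.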
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
poleval2019_mt | ---
annotations_creators:
- no-annotation
language_creators:
- expert-generated
- found
language:
- en
- pl
- ru
license:
- unknown
multilinguality:
- translation
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- translation
task_ids: []
paperswithcode_id: null
pretty_name: Poleval2019Mt
dataset_info:
- config_name: ru-pl
features:
- name: translation
dtype:
translation:
languages:
- ru
- pl
splits:
- name: train
num_bytes: 2818015
num_examples: 20001
- name: validation
num_bytes: 415735
num_examples: 3001
- name: test
num_bytes: 266462
num_examples: 2969
download_size: 3355801
dataset_size: 3500212
- config_name: en-pl
features:
- name: translation
dtype:
translation:
languages:
- en
- pl
splits:
- name: train
num_bytes: 13217798
num_examples: 129255
- name: validation
num_bytes: 1209168
num_examples: 10001
- name: test
num_bytes: 562482
num_examples: 9845
download_size: 13851405
dataset_size: 14989448
- config_name: pl-ru
features:
- name: translation
dtype:
translation:
languages:
- pl
- ru
splits:
- name: train
num_bytes: 2818015
num_examples: 20001
- name: validation
num_bytes: 415735
num_examples: 3001
- name: test
num_bytes: 149423
num_examples: 2967
download_size: 3355801
dataset_size: 3383173
- config_name: pl-en
features:
- name: translation
dtype:
translation:
languages:
- pl
- en
splits:
- name: train
num_bytes: 13217798
num_examples: 129255
- name: validation
num_bytes: 1209168
num_examples: 10001
- name: test
num_bytes: 16
num_examples: 1
download_size: 13591306
dataset_size: 14426982
---
# Dataset Card for poleval2019_mt
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** PolEval-2019 competition. http://2019.poleval.pl/
- **Repository:** Links available [on this page](http://2019.poleval.pl/index.php/tasks/task4)
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
PolEval is a SemEval-inspired evaluation campaign for natural language processing tools for Polish.
Submitted solutions compete against one another within tasks selected by the organizers, using the available data, and are evaluated according to pre-established procedures. One of the tasks in PolEval-2019 was Machine Translation (Task-4).
The task is to train the best possible machine translation system, using any technology, with limited textual resources.
The competition covers two language pairs: the more popular English-Polish (into the Polish direction) and the lower-resourced Russian-Polish (in both directions).
Here, Polish-English is also made available to allow training in both directions. However, the test data is ONLY available for English-Polish.
### Supported Tasks and Leaderboards
Supports machine translation from Russian to Polish and from English to Polish (and vice versa).
### Languages
- Polish (pl)
- Russian (ru)
- English (en)
## Dataset Structure
### Data Instances
As the training data, a set of bilingual corpora aligned at the sentence level has been prepared. The corpora are saved as UTF-8-encoded plain text, one language per file.
### Data Fields
An example translation record looks like this:
```
{
'translation': {'ru': 'не содержала в себе моделей. Модели это сравнительно новое явление. ',
'pl': 'nie miała w sobie modeli. Modele to względnie nowa dziedzina. Tak więc, jeśli '}
}
```
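A minimal sketch of how such `translation` records can be unpacked into parallel source/target lists — the two sample rows below are hypothetical stand-ins for rows returned by `load_dataset("poleval2019_mt", "ru-pl")`:

```python
# Sketch: unpacking translation records of the form shown above.
# The sample rows are hypothetical; real rows come from the downloaded dataset.
rows = [
    {"translation": {"ru": "не содержала в себе моделей.",
                     "pl": "nie miała w sobie modeli."}},
    {"translation": {"ru": "Модели это сравнительно новое явление.",
                     "pl": "Modele to względnie nowa dziedzina."}},
]

src_lang, tgt_lang = "ru", "pl"
sources = [r["translation"][src_lang] for r in rows]
targets = [r["translation"][tgt_lang] for r in rows]
assert len(sources) == len(targets)  # parallel corpus: one target per source
```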
### Data Splits
The dataset is divided into train, validation, and test splits for each language pair.
|       |  train | validation | test |
|-------|-------:|-----------:|-----:|
| ru-pl |  20001 |       3001 | 2969 |
| pl-ru |  20001 |       3001 | 2967 |
| en-pl | 129255 |      10001 | 9845 |
## Dataset Creation
### Curation Rationale
This data was curated as a task for PolEval-2019. The task is to train the best possible machine translation system, using any technology, with limited textual resources. The competition covers two language pairs: the more popular English-Polish (into the Polish direction) and the lower-resourced Russian-Polish (in both directions).
PolEval is a SemEval-inspired evaluation campaign for natural language processing tools for Polish. Submitted tools compete against one another within tasks selected by the organizers, using the available data, and are evaluated according to pre-established procedures.
PolEval 2019-related papers were presented at the AI & NLP Workshop Day (Warsaw, May 31, 2019).
Links to the top-performing models on the various tasks (including Task-4: Machine Translation) are available on [this page](http://2019.poleval.pl/index.php/publication).
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
Details about the PolEval organizers are available at this [link](http://2019.poleval.pl/index.php/organizers).
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@proceedings{ogr:kob:19:poleval,
  editor = {Maciej Ogrodniczuk and Łukasz Kobyliński},
  title = {{Proceedings of the PolEval 2019 Workshop}},
  year = {2019},
  address = {Warsaw, Poland},
  publisher = {Institute of Computer Science, Polish Academy of Sciences},
  url = {http://2019.poleval.pl/files/poleval2019.pdf},
  isbn = {978-83-63159-28-3}
}
```
### Contributions
Thanks to [@vrindaprabhu](https://github.com/vrindaprabhu) for adding this dataset. |
open-llm-leaderboard/details_shadowml__DareBeagle-7B | ---
pretty_name: Evaluation run of shadowml/DareBeagle-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [shadowml/DareBeagle-7B](https://huggingface.co/shadowml/DareBeagle-7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shadowml__DareBeagle-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-17T01:23:21.187556](https://huggingface.co/datasets/open-llm-leaderboard/details_shadowml__DareBeagle-7B/blob/main/results_2024-01-17T01-23-21.187556.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6558006222318064,\n\
\ \"acc_stderr\": 0.03207222987379561,\n \"acc_norm\": 0.6553061048140534,\n\
\ \"acc_norm_stderr\": 0.03273955111816457,\n \"mc1\": 0.5532435740514076,\n\
\ \"mc1_stderr\": 0.017403977522557148,\n \"mc2\": 0.6897520361652546,\n\
\ \"mc2_stderr\": 0.014904414829813977\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6928327645051194,\n \"acc_stderr\": 0.013481034054980945,\n\
\ \"acc_norm\": 0.7167235494880546,\n \"acc_norm_stderr\": 0.013167478735134575\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7066321449910377,\n\
\ \"acc_stderr\": 0.004543750480065778,\n \"acc_norm\": 0.880103565026887,\n\
\ \"acc_norm_stderr\": 0.003241765092912133\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903341,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903341\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n\
\ \"acc_stderr\": 0.016531170993278884,\n \"acc_norm\": 0.4245810055865922,\n\
\ \"acc_norm_stderr\": 0.016531170993278884\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.01274197433389723,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.01274197433389723\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174937,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174937\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5532435740514076,\n\
\ \"mc1_stderr\": 0.017403977522557148,\n \"mc2\": 0.6897520361652546,\n\
\ \"mc2_stderr\": 0.014904414829813977\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918753\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7149355572403336,\n \
\ \"acc_stderr\": 0.01243504233490401\n }\n}\n```"
repo_url: https://huggingface.co/shadowml/DareBeagle-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|arc:challenge|25_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|gsm8k|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hellaswag|10_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T01-23-21.187556.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T01-23-21.187556.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- '**/details_harness|winogrande|5_2024-01-17T01-23-21.187556.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-17T01-23-21.187556.parquet'
- config_name: results
data_files:
- split: 2024_01_17T01_23_21.187556
path:
- results_2024-01-17T01-23-21.187556.parquet
- split: latest
path:
- results_2024-01-17T01-23-21.187556.parquet
---
# Dataset Card for Evaluation run of shadowml/DareBeagle-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [shadowml/DareBeagle-7B](https://huggingface.co/shadowml/DareBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
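The split-naming convention described above can be sketched in a couple of lines: judging from the configs in the YAML metadata, the split name appears to be the run timestamp with `-` and `:` replaced by `_` (this mapping is inferred from the file, not documented by the leaderboard):

```python
# Derive the split name used in this dataset from a run timestamp.
# The convention (inferred from the configs above) replaces "-" and ":" with "_".
timestamp = "2024-01-17T01:23:21.187556"
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2024_01_17T01_23_21.187556
```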
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shadowml__DareBeagle-7B",
                    "harness_winogrande_5",
                    split="latest")
```
## Latest results
These are the [latest results from run 2024-01-17T01:23:21.187556](https://huggingface.co/datasets/open-llm-leaderboard/details_shadowml__DareBeagle-7B/blob/main/results_2024-01-17T01-23-21.187556.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6558006222318064,
"acc_stderr": 0.03207222987379561,
"acc_norm": 0.6553061048140534,
"acc_norm_stderr": 0.03273955111816457,
"mc1": 0.5532435740514076,
"mc1_stderr": 0.017403977522557148,
"mc2": 0.6897520361652546,
"mc2_stderr": 0.014904414829813977
},
"harness|arc:challenge|25": {
"acc": 0.6928327645051194,
"acc_stderr": 0.013481034054980945,
"acc_norm": 0.7167235494880546,
"acc_norm_stderr": 0.013167478735134575
},
"harness|hellaswag|10": {
"acc": 0.7066321449910377,
"acc_stderr": 0.004543750480065778,
"acc_norm": 0.880103565026887,
"acc_norm_stderr": 0.003241765092912133
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7320754716981132,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.7320754716981132,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903341,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903341
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.016531170993278884,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.016531170993278884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.01274197433389723,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.01274197433389723
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146293,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146293
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174937,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174937
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5532435740514076,
"mc1_stderr": 0.017403977522557148,
"mc2": 0.6897520361652546,
"mc2_stderr": 0.014904414829813977
},
"harness|winogrande|5": {
"acc": 0.8232044198895028,
"acc_stderr": 0.010721923287918753
},
"harness|gsm8k|5": {
"acc": 0.7149355572403336,
"acc_stderr": 0.01243504233490401
}
}
```
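Scores for a group of tasks in a results dict like the one above can be aggregated with a short snippet. This is a sketch that assumes the JSON has already been parsed into a Python dict named `results`; only a small subset of the entries is reproduced here:

```python
# Average the per-task accuracies of the MMLU (hendrycksTest) entries.
# `results` stands in for the parsed JSON above; only a subset is shown.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.36},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6518518518518519},
    "harness|winogrande|5": {"acc": 0.8232044198895028},
}
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
average_acc = sum(mmlu) / len(mmlu)
print(round(average_acc, 4))  # 0.5059
```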
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Weni/wenigpt-agent-1.2.0-positive | ---
dataset_info:
features:
- name: title
dtype: string
- name: link
dtype: string
- name: content
dtype: string
- name: content_base_uuid
dtype: string
- name: base_link_uuid
dtype: string
- name: adjective
dtype: string
- name: name
dtype: string
- name: occupation
dtype: string
- name: chatbot_goal
dtype: string
- name: instructions
sequence: string
- name: question
dtype: string
- name: answer
dtype: string
- name: human_eval
dtype: string
- name: id
dtype: int64
- name: chunks_small
list:
- name: content
dtype: string
- name: score
dtype: float64
- name: chunks_big
list:
- name: content
dtype: string
- name: score
dtype: float64
- name: groundedness
dtype: float64
- name: correct_ans
dtype: int64
- name: greetings
dtype: int64
- name: context_size_classification
dtype: int64
- name: emoji
dtype: int64
- name: groundedness-gpt4
dtype: float64
splits:
- name: train
num_bytes: 6678659
num_examples: 361
- name: teste
num_bytes: 939169
num_examples: 41
download_size: 2508521
dataset_size: 7617828
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: teste
path: data/teste-*
---
|
jlbaker361/actstu-dream | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: seed
dtype: int64
- name: steps
dtype: int64
splits:
- name: train
num_bytes: 30770281.0
num_examples: 28
download_size: 30772754
dataset_size: 30770281.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.0_seed_1_t_1.0 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43723341
num_examples: 18928
- name: epoch_1
num_bytes: 44325440
num_examples: 18928
- name: epoch_2
num_bytes: 44405123
num_examples: 18928
- name: epoch_3
num_bytes: 44444051
num_examples: 18928
- name: epoch_4
num_bytes: 44458482
num_examples: 18928
- name: epoch_5
num_bytes: 44455366
num_examples: 18928
- name: epoch_6
num_bytes: 44447631
num_examples: 18928
- name: epoch_7
num_bytes: 44442729
num_examples: 18928
- name: epoch_8
num_bytes: 44438685
num_examples: 18928
- name: epoch_9
num_bytes: 44437437
num_examples: 18928
- name: epoch_10
num_bytes: 44436630
num_examples: 18928
- name: epoch_11
num_bytes: 44434315
num_examples: 18928
- name: epoch_12
num_bytes: 44433641
num_examples: 18928
- name: epoch_13
num_bytes: 44435463
num_examples: 18928
- name: epoch_14
num_bytes: 44434899
num_examples: 18928
- name: epoch_15
num_bytes: 44434832
num_examples: 18928
- name: epoch_16
num_bytes: 44437080
num_examples: 18928
- name: epoch_17
num_bytes: 44434587
num_examples: 18928
- name: epoch_18
num_bytes: 44436021
num_examples: 18928
- name: epoch_19
num_bytes: 44435166
num_examples: 18928
- name: epoch_20
num_bytes: 44436584
num_examples: 18928
- name: epoch_21
num_bytes: 44436641
num_examples: 18928
- name: epoch_22
num_bytes: 44435671
num_examples: 18928
- name: epoch_23
num_bytes: 44436033
num_examples: 18928
- name: epoch_24
num_bytes: 44437932
num_examples: 18928
- name: epoch_25
num_bytes: 44436019
num_examples: 18928
- name: epoch_26
num_bytes: 44437586
num_examples: 18928
- name: epoch_27
num_bytes: 44437085
num_examples: 18928
- name: epoch_28
num_bytes: 44438176
num_examples: 18928
- name: epoch_29
num_bytes: 44437904
num_examples: 18928
download_size: 701043820
dataset_size: 1332300550
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
joey234/rt_non_affix | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
splits:
- name: test
num_bytes: 122830.3789868668
num_examples: 963
download_size: 79719
dataset_size: 122830.3789868668
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "rt_non_affix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_186 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 641495252.0
num_examples: 125981
download_size: 647860754
dataset_size: 641495252.0
---
# Dataset Card for "chunk_186"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shahbajsingh/cs482-housing | ---
dataset_info:
features:
- name: '0'
dtype: float64
- name: '1'
dtype: float64
- name: '2'
dtype: float64
- name: '3'
dtype: float64
- name: '4'
dtype: float64
- name: '5'
dtype: float64
- name: '6'
dtype: float64
- name: '7'
dtype: float64
splits:
- name: train
num_bytes: 1046016
num_examples: 16344
download_size: 814920
dataset_size: 1046016
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HighCWu/fill50k | ---
license: openrail
dataset_info:
features:
- name: image
dtype: image
- name: guide
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 454411979
num_examples: 50000
download_size: 316021131
dataset_size: 454411979
language:
- en
pretty_name: a
---
# Dataset Card for Fill50K
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset is converted from the fill50k example dataset of [ControlNet](https://github.com/lllyasviel/ControlNet)
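To give a sense of the `text` field, here is a hypothetical sketch that builds fill50k-style captions pairing a circle's fill color with a background color. The color palette and prompt template below are assumptions for illustration; the actual data comes from the original ControlNet fill50k.zip linked under Source Data.

```python
import random

# Hypothetical palette; the real dataset draws from a larger set of color names.
COLORS = ["red", "blue", "green", "yellow", "purple", "orange"]

def make_prompt(rng: random.Random) -> str:
    """Build a fill50k-style caption: a filled circle on a solid background."""
    # sample(…, 2) guarantees the fill and background colors differ.
    fill, background = rng.sample(COLORS, 2)
    return f"{fill} circle with {background} background"

print(make_prompt(random.Random(0)))
```

Each caption pairs with an `image` (the rendered circle) and a `guide` (the conditioning image) in the actual dataset rows.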
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[fill50k.zip](https://huggingface.co/lllyasviel/ControlNet/blob/main/training/fill50k.zip)
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache | ---
pretty_name: Evaluation run of saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache](https://huggingface.co/saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T18:43:29.335129](https://huggingface.co/datasets/open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache/blob/main/results_2024-01-28T18-43-29.335129.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2316811015320164,\n\
\ \"acc_stderr\": 0.029921145723277393,\n \"acc_norm\": 0.2319513129670488,\n\
\ \"acc_norm_stderr\": 0.030716547854869283,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731606,\n \"mc2\": 0.4668597319189143,\n\
\ \"mc2_stderr\": 0.016229462983418045\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.18515358361774745,\n \"acc_stderr\": 0.011350774438389699,\n\
\ \"acc_norm\": 0.2380546075085324,\n \"acc_norm_stderr\": 0.0124457700280262\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26249751045608444,\n\
\ \"acc_stderr\": 0.004390923353200559,\n \"acc_norm\": 0.2704640509858594,\n\
\ \"acc_norm_stderr\": 0.004432917403755056\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731606,\n\
\ \"mc2\": 0.4668597319189143,\n \"mc2_stderr\": 0.016229462983418045\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.5082872928176796,\n\
\ \"acc_stderr\": 0.014050555322824189\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|arc:challenge|25_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|gsm8k|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hellaswag|10_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T18-43-29.335129.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T18-43-29.335129.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- '**/details_harness|winogrande|5_2024-01-28T18-43-29.335129.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T18-43-29.335129.parquet'
- config_name: results
data_files:
- split: 2024_01_28T18_43_29.335129
path:
- results_2024-01-28T18-43-29.335129.parquet
- split: latest
path:
- results_2024-01-28T18-43-29.335129.parquet
---
# Dataset Card for Evaluation run of saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache](https://huggingface.co/saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
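As a small illustration of the naming convention (an assumption inferred from the configuration list above, not an official parser), each config name encodes the harness, the task, and the number of few-shot examples:

```python
# Hypothetical helper: split a config name such as
# "harness_hendrycksTest_college_biology_5" into its parts.
# This mirrors the naming pattern visible in the YAML above; it is an
# illustration, not part of the leaderboard tooling.
def parse_config_name(name: str):
    parts = name.split("_")
    harness = parts[0]                 # e.g. "harness"
    shots = int(parts[-1])             # trailing number = few-shot count
    task = "_".join(parts[1:-1])       # everything in between is the task
    return harness, task, shots

print(parse_config_name("harness_hendrycksTest_college_biology_5"))
# → ('harness', 'hendrycksTest_college_biology', 5)
```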
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-28T18:43:29.335129](https://huggingface.co/datasets/open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache/blob/main/results_2024-01-28T18-43-29.335129.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its timestamped splits and in the "latest" split of its configuration):
```python
{
"all": {
"acc": 0.2316811015320164,
"acc_stderr": 0.029921145723277393,
"acc_norm": 0.2319513129670488,
"acc_norm_stderr": 0.030716547854869283,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731606,
"mc2": 0.4668597319189143,
"mc2_stderr": 0.016229462983418045
},
"harness|arc:challenge|25": {
"acc": 0.18515358361774745,
"acc_stderr": 0.011350774438389699,
"acc_norm": 0.2380546075085324,
"acc_norm_stderr": 0.0124457700280262
},
"harness|hellaswag|10": {
"acc": 0.26249751045608444,
"acc_stderr": 0.004390923353200559,
"acc_norm": 0.2704640509858594,
"acc_norm_stderr": 0.004432917403755056
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731606,
"mc2": 0.4668597319189143,
"mc2_stderr": 0.016229462983418045
},
"harness|winogrande|5": {
"acc": 0.5082872928176796,
"acc_stderr": 0.014050555322824189
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
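As an illustrative sketch (not the official aggregation code), the leaderboard's aggregate metrics are macro averages over the per-task scores. The dictionary below hand-copies a few `acc` values from the JSON above to show the idea:

```python
# Hypothetical sketch: macro-averaging a handful of per-task accuracies
# copied from the "Latest results" JSON above. The real leaderboard
# aggregation covers all tasks; this subset is only for illustration.
per_task_acc = {
    "hendrycksTest-abstract_algebra": 0.22,
    "hendrycksTest-college_chemistry": 0.2,
    "hendrycksTest-computer_security": 0.28,
    "hendrycksTest-global_facts": 0.18,
}

macro_avg = sum(per_task_acc.values()) / len(per_task_acc)
print(f"macro-average acc over {len(per_task_acc)} tasks: {macro_avg:.3f}")
# → macro-average acc over 4 tasks: 0.220
```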
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kaist-ai/Preference-Collection | ---
language:
- en
size_categories:
- 100K<n<1M
task_categories:
- text-generation
dataset_info:
features:
- name: orig_criteria
dtype: string
- name: orig_feedback_A
dtype: string
- name: orig_feedback_B
dtype: string
- name: orig_instruction
dtype: string
- name: orig_reference_answer
dtype: string
- name: orig_response_A
dtype: string
- name: orig_response_B
dtype: string
- name: orig_score_A
dtype: string
- name: orig_score_B
dtype: string
- name: orig_preference
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
- name: orig_feedback
dtype: string
splits:
- name: train
num_bytes: 2925408348
num_examples: 199760
download_size: 703919707
dataset_size: 2925408348
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
adsabs/WIESP2022-NER | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: 'WIESP2022-NER'
size_categories:
- 1K<n<10K
source_datasets: []
task_categories:
- token-classification
task_ids:
- named-entity-recognition
---
# Dataset for the first <a href="https://ui.adsabs.harvard.edu/WIESP/" style="color:blue">Workshop on Information Extraction from Scientific Publications (WIESP/2022)</a>.
## Dataset Description
Datasets with text fragments from astrophysics papers, provided by the [NASA Astrophysical Data System](https://ui.adsabs.harvard.edu/) with manually tagged astronomical facilities and other entities of interest (e.g., celestial objects).
Datasets are in JSON Lines format (each line is a json dictionary).
The datasets are formatted similarly to the CoNLL-2003 format: each token is associated with an NER tag. The tags follow the "B-" and "I-" convention from the [IOB2 syntax](https://en.wikipedia.org/wiki/Inside%E2%80%93outside%E2%80%93beginning_%28tagging%29).
Each entry consists of a dictionary with the following keys:
- `"unique_id"`: a unique identifier for this data sample. Must be included in the predictions.
- `"tokens"`: the list of tokens (strings) that form the text of this sample. Must be included in the predictions.
- `"ner_tags"`: the list of NER tags (in IOB2 format)
The following keys are not strictly needed by the participants:
- `"ner_ids"`: the pre-computed list of ids corresponding to the ner_tags, as given by the dictionary in ner_tags.json
- `"label_studio_id"`, `"section"`, `"bibcode"`: references for internal NASA/ADS use.
## Instructions for Workshop participants:
How to load the data using the Huggingface library:
```python
from datasets import load_dataset
dataset = load_dataset("adsabs/WIESP2022-NER")
```
How to load the data if you cloned the repository locally:
(assuming `./WIESP2022-NER-DEV.jsonl` is in the current directory, change as needed)
- python (as list of dictionaries):
```python
import json
with open("./WIESP2022-NER-DEV.jsonl", 'r') as f:
wiesp_dev_json = [json.loads(l) for l in list(f)]
```
- into Huggingface (as a Huggingface Dataset):
```python
from datasets import Dataset
wiesp_dev_from_json = Dataset.from_json(path_or_paths="./WIESP2022-NER-DEV.jsonl")
```
How to compute your scores on the training data:
1. format your predictions as a list of dictionaries, each with the same `"unique_id"` and `"tokens"` keys from the dataset, as well as the list of predicted NER tags under the `"pred_ner_tags"` key (see `WIESP2022-NER-DEV-sample-predictions.jsonl` for an example).
2. pass the references and predictions datasets to the `compute_MCC()` and `compute_seqeval()` functions (from the `.py` files with the same names).
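As a sketch of step 1, here is one way a single prediction entry might be assembled and serialized. The sample text, its tokens, and the `B-Telescope` tag below are hypothetical illustrations only; see `tag_definitions.md` for the actual tag set.

```python
import json

# Hypothetical reference sample; real ones come from WIESP2022-NER-TRAINING.jsonl
reference = {"unique_id": "sample-0001", "tokens": ["Observed", "with", "Chandra", "."]}

# A prediction keeps the same "unique_id" and "tokens" and adds "pred_ner_tags"
prediction = {
    "unique_id": reference["unique_id"],
    "tokens": reference["tokens"],
    "pred_ner_tags": ["O", "O", "B-Telescope", "O"],  # one IOB2 tag per token
}
assert len(prediction["pred_ner_tags"]) == len(prediction["tokens"])

# Write one JSON dictionary per line, as in WIESP2022-NER-DEV-sample-predictions.jsonl
line = json.dumps(prediction)
print(line)
```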
Requirements to run the scoring scripts:
- [NumPy](https://numpy.org/install/)
- [scikit-learn](https://scikit-learn.org/stable/install.html)
- [seqeval](https://github.com/chakki-works/seqeval#installation)
To get scores on the validation data, zip your predictions file (a single `.jsonl` file formatted following the same instructions as above) and upload the `.zip` file to the [Codalab](https://codalab.lisn.upsaclay.fr/competitions/5062) competition.
## File list
```
├── WIESP2022-NER-TRAINING.jsonl : 1753 samples for training.
├── WIESP2022-NER-DEV.jsonl : 20 samples for development.
├── WIESP2022-NER-DEV-sample-predictions.jsonl : an example file with properly formatted predictions on the development data.
├── WIESP2022-NER-VALIDATION-NO-LABELS.jsonl : 1366 samples for validation without the NER labels. Used for the WIESP2022 workshop.
├── WIESP2022-NER-VALIDATION.jsonl : 1366 samples for validation
├── WIESP2022-NER-TESTING-NO-LABELS.jsonl : 2505 samples for testing without the NER labels. Used for the WIESP2022 workshop.
├── WIESP2022-NER-TESTING.jsonl : 2505 samples for testing
├── README.MD : this file.
├── tag_definitions.md : short descriptions and examples of the tags used in the task.
└── scoring-scripts/ : scripts used to evaluate submissions.
├── compute_MCC.py : computes the Matthews correlation coefficient between two datasets.
└── compute_seqeval.py : computes the seqeval scores (precision, recall, f1, overall and for each class) between two datasets.
```
## Cite as
[Overview of the First Shared Task on Detecting Entities in the Astrophysics Literature (DEAL)](https://aclanthology.org/2022.wiesp-1.1) (Grezes et al., WIESP 2022)
```bibtex
@inproceedings{grezes-etal-2022-overview,
title = "Overview of the First Shared Task on Detecting Entities in the Astrophysics Literature ({DEAL})",
author = "Grezes, Felix and
Blanco-Cuaresma, Sergi and
Allen, Thomas and
Ghosal, Tirthankar",
booktitle = "Proceedings of the first Workshop on Information Extraction from Scientific Publications",
month = "nov",
year = "2022",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.wiesp-1.1",
pages = "1--7",
abstract = "In this article, we describe the overview of our shared task: Detecting Entities in the Astrophysics Literature (DEAL). The DEAL shared task was part of the Workshop on Information Extraction from Scientific Publications (WIESP) in AACL-IJCNLP 2022. Information extraction from scientific publications is critical in several downstream tasks such as identification of critical entities, article summarization, citation classification, etc. The motivation of this shared task was to develop a community-wide effort for entity extraction from astrophysics literature. Automated entity extraction would help to build knowledge bases, high-quality meta-data for indexing and search, and several other use-cases of interests. Thirty-three teams registered for DEAL, twelve of them participated in the system runs, and finally four teams submitted their system descriptions. We analyze their system and performance and finally discuss the findings of DEAL.",
}
``` |
HydraLM/partitioned_v2_standardized_09 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
splits:
- name: train
num_bytes: 37044613.11862594
num_examples: 72472
download_size: 8455589
dataset_size: 37044613.11862594
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v2_standardized_09"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
holen/Finite_element_crash_data | ---
license: apache-2.0
---
The data contains three different vehicles from CCSA (https://www.ccsa.gmu.edu/models/):
- A Toyota Yaris
- A Chevy Silverado
- An ADS vehicle
These vehicles were tested at different speeds, and the binout files were stored.
The car models were used to develop an AI that could estimate a full frontal impact for different cars at different speeds.
This can then be used to predict the force of an impact in an autonomous car simulator.
Lollitor/MyPubChem2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 295081.2
num_examples: 1800
- name: validation
num_bytes: 32786.8
num_examples: 200
download_size: 103924
dataset_size: 327868.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "MyPubChem2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigscience/massive-probing-results | ---
license: apache-2.0
---
|
crumb/flan-t5-large-embed-refinedweb | ---
license: apache-2.0
language:
- en
task_categories:
- feature-extraction
tags:
- t5
- flan
size_categories:
- 100K<n<1M
---
All of the data together is around 81.3 GB. It consists of the last hidden states of 131,072 samples from RefinedWeb, left-padded/truncated to 512 tokens and fed through [google/flan-t5-large](https://hf.co/google/flan-t5-large).
Structure:
```
{
"encoding": List, shaped (512, 1024) aka (tokens, d_model),
"text": String, the original text that was encoded,
"attention_mask": List, binary mask to pass to your model with encoding to not attend to pad tokens
}
``` |
BangumiBase/higurashinonakukoroni | ---
license: mit
tags:
- art
size_categories:
- 10K<n<100K
---
# Bangumi Image Base of Higurashi No Naku Koro Ni
This is the image base of the bangumi Higurashi no Naku Koro Ni. We detected 71 characters and 12274 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 18 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 306 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 29 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 38 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 17 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 16 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 30 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 1686 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 412 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 77 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 32 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 124 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 135 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 103 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 36 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 717 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 125 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 389 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 98 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 63 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 141 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 31 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 126 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 9 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 38 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 260 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 52 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 919 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 27 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 19 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 29 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 20 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 56 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 17 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 32 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 34 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 20 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 26 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 10 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 128 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 1451 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 84 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 37 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 19 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 18 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 95 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 1392 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 75 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 20 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 419 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 15 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 94 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 1639 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 36 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 35 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 10 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 14 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 17 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 16 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 7 | [Download](59/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 60 | 9 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 8 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 8 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 8 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 12 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 9 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 8 | [Download](66/dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 23 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 12 | [Download](68/dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 5 | [Download](69/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| noise | 234 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
distilled-from-one-sec-cv12/chunk_254 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 745002176
num_examples: 145168
download_size: 753562749
dataset_size: 745002176
---
# Dataset Card for "chunk_254"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
christinacdl/Offensive_Hateful_Dataset_New | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
--- |
Multimodal-Fatima/FGVC_Aircraft_test_facebook_opt_2.7b_Visclues_ns_3333 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_16
num_bytes: 299564852.375
num_examples: 3333
- name: fewshot_1_bs_16
num_bytes: 300685275.375
num_examples: 3333
- name: fewshot_3_bs_16
num_bytes: 302937771.375
num_examples: 3333
download_size: 892471687
dataset_size: 903187899.125
---
# Dataset Card for "FGVC_Aircraft_test_facebook_opt_2.7b_Visclues_ns_3333"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/COCO_captions_test | ---
dataset_info:
features:
- name: image
dtype: image
- name: filepath
dtype: string
- name: sentids
list: int32
- name: filename
dtype: string
- name: imgid
dtype: int32
- name: split
dtype: string
- name: sentences_tokens
list:
list: string
- name: sentences_raw
list: string
- name: sentences_sentid
list: int32
- name: cocoid
dtype: int32
- name: id
dtype: int64
- name: clip_tags_ViT_L_14
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B
sequence: string
- name: blip_caption_beam_5
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_LAION-ViT-H-14-2B
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes
list:
- name: attribute
dtype: string
- name: box
sequence: float32
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float32
- name: size
dtype: string
- name: tag
dtype: string
splits:
- name: test
num_bytes: 831189492.0
num_examples: 5000
download_size: 823516792
dataset_size: 831189492.0
---
# Dataset Card for "COCO_captions_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aryanmehta5902/doctest1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 58694263
num_examples: 2524
download_size: 15613485
dataset_size: 58694263
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
NovelQA/NovelQA | ---
license: apache-2.0
task_categories:
- question-answering
size_categories:
- 10K<n<100K
---
# Dataset Card for NovelQA
<!-- Provide a quick summary of the dataset. -->
NovelQA is a benchmark for testing the long-context abilities of LLMs.
## Dataset Details
### Dataset Description
- **Language:** English
### Dataset Sources
- **Repository:** https://github.com/NovelQA/novelqa.github.io
- **Leaderboard:** https://novelqa.github.io
- **Competition:** https://www.codabench.org/competitions/2295/
- **Paper:** https://arxiv.org/abs/2403.12766
<!-- - **Demo [optional]:** -->
## Uses
### Directly downloading
You can directly download NovelQA.zip, which contains all the files in Raw_Novels/, Data/, and Demonstration/.
### Through API
You can also load this dataset through the Hugging Face `datasets` package as follows.
```python
from datasets import load_dataset
dataset = load_dataset("NovelQA/NovelQA", data_files = {
"book": "books/*.txt",
"ques": "ques/*.json"
}, streaming=True)
books = dataset["book"]
ques = dataset["ques"]
```
## Dataset Structure
The dataset is structured as follows.
```
- NovelQA.zip // This zip file contains all of the novels, data and demonstration
- NovelQA
| - book // the book contents of the novels
| | - booktitle1.txt
| | - booktitle2.txt
| | - ...
|
| - ques // the corresponding QA-pairs of each book
| - booktitle1.json
| - booktitle2.json
| - ...
```
Each of these JSON files contains a list of dicts, each structured as follows.
```json
{
"Question": "The input question",
"Options": [
"Option A",
"Option B",
"Option C",
"Option D"
],
"Complex": "A complexity level among mh, sh, and dtl",
"Aspect": "An aspect from times, meaning, span, settg, relat, character, and plot"
},
```
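For illustration, here is a minimal sketch of turning one such entry into a multiple-choice prompt. The entry below is hypothetical; real entries come from the per-book JSON files under ques/.

```python
# Hypothetical QA entry following the structure above
entry = {
    "Question": "Who raised the protagonist?",
    "Options": ["The aunt", "The grandfather", "A neighbor", "An orphanage"],
    "Complex": "sh",
    "Aspect": "character",
}

# Label the options A-D and build one prompt string per QA pair
labeled = [f"{label}. {text}" for label, text in zip("ABCD", entry["Options"])]
prompt = entry["Question"] + "\n" + "\n".join(labeled)
print(prompt)
```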
## Citation
**BibTeX:**
```bibtex
@misc{wang2024novelqa,
title={NovelQA: A Benchmark for Long-Range Novel Question Answering},
author={Cunxiang Wang and Ruoxi Ning and Boqi Pan and Tonghui Wu and Qipeng Guo and Cheng Deng and Guangsheng Bao and Qian Wang and Yue Zhang},
year={2024},
eprint={2403.12766},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## Term
Your participation in and submission to this benchmark constitutes your consent to the following terms.
The input data are for internal evaluation use only.
Please do not spread the input data publicly online.
The competition hosts are not responsible for any violation of novel copyright caused by participants spreading the input data publicly online.
## Contact
If you have problems downloading or using this dataset, please contact the first authors of the arXiv paper to get access to the dataset.
joey234/mmlu-public_relations-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 47692
num_examples: 110
download_size: 32701
dataset_size: 47692
---
# Dataset Card for "mmlu-public_relations-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_upstage__SOLAR-0-70b-16bit | ---
pretty_name: Evaluation run of upstage/SOLAR-0-70b-16bit
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [upstage/SOLAR-0-70b-16bit](https://huggingface.co/upstage/SOLAR-0-70b-16bit)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_upstage__SOLAR-0-70b-16bit_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-07T01:00:47.965413](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__SOLAR-0-70b-16bit_public/blob/main/results_2023-11-07T01-00-47.965413.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3555998322147651,\n\
\ \"em_stderr\": 0.004902281518260701,\n \"f1\": 0.47494337248322493,\n\
\ \"f1_stderr\": 0.004563199491248503,\n \"acc\": 0.6442241467520119,\n\
\ \"acc_stderr\": 0.012060674423078888\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3555998322147651,\n \"em_stderr\": 0.004902281518260701,\n\
\ \"f1\": 0.47494337248322493,\n \"f1_stderr\": 0.004563199491248503\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45261561789234267,\n \
\ \"acc_stderr\": 0.013710499070934969\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222808\n\
\ }\n}\n```"
repo_url: https://huggingface.co/upstage/SOLAR-0-70b-16bit
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_07T01_00_47.965413
path:
- '**/details_harness|drop|3_2023-11-07T01-00-47.965413.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-07T01-00-47.965413.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_07T01_00_47.965413
path:
- '**/details_harness|gsm8k|5_2023-11-07T01-00-47.965413.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-07T01-00-47.965413.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_07T01_00_47.965413
path:
- '**/details_harness|winogrande|5_2023-11-07T01-00-47.965413.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-07T01-00-47.965413.parquet'
- config_name: results
data_files:
- split: 2023_11_07T01_00_47.965413
path:
- results_2023-11-07T01-00-47.965413.parquet
- split: latest
path:
- results_2023-11-07T01-00-47.965413.parquet
---
# Dataset Card for Evaluation run of upstage/SOLAR-0-70b-16bit
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/upstage/SOLAR-0-70b-16bit
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [upstage/SOLAR-0-70b-16bit](https://huggingface.co/upstage/SOLAR-0-70b-16bit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_upstage__SOLAR-0-70b-16bit_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-07T01:00:47.965413](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__SOLAR-0-70b-16bit_public/blob/main/results_2023-11-07T01-00-47.965413.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3555998322147651,
"em_stderr": 0.004902281518260701,
"f1": 0.47494337248322493,
"f1_stderr": 0.004563199491248503,
"acc": 0.6442241467520119,
"acc_stderr": 0.012060674423078888
},
"harness|drop|3": {
"em": 0.3555998322147651,
"em_stderr": 0.004902281518260701,
"f1": 0.47494337248322493,
"f1_stderr": 0.004563199491248503
},
"harness|gsm8k|5": {
"acc": 0.45261561789234267,
"acc_stderr": 0.013710499070934969
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222808
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Emiliedeyk/baixamemoria | ---
license: openrail
---
|
open-llm-leaderboard/details_senseable__Wilbur-30B | ---
pretty_name: Evaluation run of senseable/Wilbur-30B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [senseable/Wilbur-30B](https://huggingface.co/senseable/Wilbur-30B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_senseable__Wilbur-30B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-27T07:45:34.703302](https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__Wilbur-30B/blob/main/results_2024-01-27T07-45-34.703302.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7650338898352297,\n\
\ \"acc_stderr\": 0.028248683874528373,\n \"acc_norm\": 0.7682008360158653,\n\
\ \"acc_norm_stderr\": 0.028793309090233483,\n \"mc1\": 0.5263157894736842,\n\
\ \"mc1_stderr\": 0.017479241161975457,\n \"mc2\": 0.6996159108788989,\n\
\ \"mc2_stderr\": 0.014237498534320117\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7218430034129693,\n \"acc_stderr\": 0.0130944699195388,\n\
\ \"acc_norm\": 0.7406143344709898,\n \"acc_norm_stderr\": 0.012808273573927094\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6719776936865166,\n\
\ \"acc_stderr\": 0.004685334844038661,\n \"acc_norm\": 0.866759609639514,\n\
\ \"acc_norm_stderr\": 0.003391398293613441\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7481481481481481,\n\
\ \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.7481481481481481,\n\
\ \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474945,\n\
\ \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474945\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8113207547169812,\n \"acc_stderr\": 0.024079995130062253,\n\
\ \"acc_norm\": 0.8113207547169812,\n \"acc_norm_stderr\": 0.024079995130062253\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n\
\ \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n\
\ \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7225433526011561,\n\
\ \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.7225433526011561,\n\
\ \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.04959859966384181,\n\
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.04959859966384181\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7617021276595745,\n \"acc_stderr\": 0.027851252973889774,\n\
\ \"acc_norm\": 0.7617021276595745,\n \"acc_norm_stderr\": 0.027851252973889774\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n\
\ \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.5701754385964912,\n\
\ \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7586206896551724,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.7586206896551724,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7407407407407407,\n \"acc_stderr\": 0.022569897074918424,\n \"\
acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.022569897074918424\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n\
\ \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n\
\ \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6847290640394089,\n \"acc_stderr\": 0.03269080871970186,\n\
\ \"acc_norm\": 0.6847290640394089,\n \"acc_norm_stderr\": 0.03269080871970186\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.026024657651656187,\n\
\ \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.026024657651656187\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9292929292929293,\n \"acc_stderr\": 0.018263105420199505,\n \"\
acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.018263105420199505\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.011464523356953162,\n\
\ \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.011464523356953162\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8102564102564103,\n \"acc_stderr\": 0.0198801654065888,\n \
\ \"acc_norm\": 0.8102564102564103,\n \"acc_norm_stderr\": 0.0198801654065888\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.45925925925925926,\n \"acc_stderr\": 0.03038416923235083,\n \
\ \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.03038416923235083\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.023274255898707952,\n\
\ \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.023274255898707952\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5165562913907285,\n \"acc_stderr\": 0.04080244185628972,\n \"\
acc_norm\": 0.5165562913907285,\n \"acc_norm_stderr\": 0.04080244185628972\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9247706422018349,\n \"acc_stderr\": 0.011308662537571727,\n \"\
acc_norm\": 0.9247706422018349,\n \"acc_norm_stderr\": 0.011308662537571727\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6527777777777778,\n \"acc_stderr\": 0.032468872436376486,\n \"\
acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.032468872436376486\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9156118143459916,\n \"acc_stderr\": 0.01809424711647333,\n \
\ \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.01809424711647333\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.820627802690583,\n\
\ \"acc_stderr\": 0.0257498195691928,\n \"acc_norm\": 0.820627802690583,\n\
\ \"acc_norm_stderr\": 0.0257498195691928\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540627,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540627\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n\
\ \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n\
\ \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.026321383198783674,\n\
\ \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.026321383198783674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\
\ \"acc_stderr\": 0.015537514263253858,\n \"acc_norm\": 0.9401709401709402,\n\
\ \"acc_norm_stderr\": 0.015537514263253858\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9144316730523627,\n\
\ \"acc_stderr\": 0.010002965568647286,\n \"acc_norm\": 0.9144316730523627,\n\
\ \"acc_norm_stderr\": 0.010002965568647286\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8294797687861272,\n \"acc_stderr\": 0.020247961569303728,\n\
\ \"acc_norm\": 0.8294797687861272,\n \"acc_norm_stderr\": 0.020247961569303728\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7988826815642458,\n\
\ \"acc_stderr\": 0.013405946402609047,\n \"acc_norm\": 0.7988826815642458,\n\
\ \"acc_norm_stderr\": 0.013405946402609047\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8594771241830066,\n \"acc_stderr\": 0.019899435463539946,\n\
\ \"acc_norm\": 0.8594771241830066,\n \"acc_norm_stderr\": 0.019899435463539946\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7942122186495176,\n\
\ \"acc_stderr\": 0.022961339906764244,\n \"acc_norm\": 0.7942122186495176,\n\
\ \"acc_norm_stderr\": 0.022961339906764244\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.018689725721062075,\n\
\ \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.018689725721062075\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6312056737588653,\n \"acc_stderr\": 0.028782227561347254,\n \
\ \"acc_norm\": 0.6312056737588653,\n \"acc_norm_stderr\": 0.028782227561347254\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5912646675358539,\n\
\ \"acc_stderr\": 0.012555701346703382,\n \"acc_norm\": 0.5912646675358539,\n\
\ \"acc_norm_stderr\": 0.012555701346703382\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8308823529411765,\n \"acc_stderr\": 0.022770868010113014,\n\
\ \"acc_norm\": 0.8308823529411765,\n \"acc_norm_stderr\": 0.022770868010113014\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8218954248366013,\n \"acc_stderr\": 0.01547836965310857,\n \
\ \"acc_norm\": 0.8218954248366013,\n \"acc_norm_stderr\": 0.01547836965310857\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8530612244897959,\n \"acc_stderr\": 0.02266540041721764,\n\
\ \"acc_norm\": 0.8530612244897959,\n \"acc_norm_stderr\": 0.02266540041721764\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9104477611940298,\n\
\ \"acc_stderr\": 0.02019067053502792,\n \"acc_norm\": 0.9104477611940298,\n\
\ \"acc_norm_stderr\": 0.02019067053502792\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n\
\ \"acc_stderr\": 0.03828401115079021,\n \"acc_norm\": 0.5903614457831325,\n\
\ \"acc_norm_stderr\": 0.03828401115079021\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276908,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276908\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5263157894736842,\n\
\ \"mc1_stderr\": 0.017479241161975457,\n \"mc2\": 0.6996159108788989,\n\
\ \"mc2_stderr\": 0.014237498534320117\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370623\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \
\ \"acc_stderr\": 0.01233344758104754\n }\n}\n```"
repo_url: https://huggingface.co/senseable/Wilbur-30B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|arc:challenge|25_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|gsm8k|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hellaswag|10_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T07-45-34.703302.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T07-45-34.703302.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- '**/details_harness|winogrande|5_2024-01-27T07-45-34.703302.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-27T07-45-34.703302.parquet'
- config_name: results
data_files:
- split: 2024_01_27T07_45_34.703302
path:
- results_2024-01-27T07-45-34.703302.parquet
- split: latest
path:
- results_2024-01-27T07-45-34.703302.parquet
---
# Dataset Card for Evaluation run of senseable/Wilbur-30B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [senseable/Wilbur-30B](https://huggingface.co/senseable/Wilbur-30B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_senseable__Wilbur-30B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-27T07:45:34.703302](https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__Wilbur-30B/blob/main/results_2024-01-27T07-45-34.703302.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval's config):
```python
{
"all": {
"acc": 0.7650338898352297,
"acc_stderr": 0.028248683874528373,
"acc_norm": 0.7682008360158653,
"acc_norm_stderr": 0.028793309090233483,
"mc1": 0.5263157894736842,
"mc1_stderr": 0.017479241161975457,
"mc2": 0.6996159108788989,
"mc2_stderr": 0.014237498534320117
},
"harness|arc:challenge|25": {
"acc": 0.7218430034129693,
"acc_stderr": 0.0130944699195388,
"acc_norm": 0.7406143344709898,
"acc_norm_stderr": 0.012808273573927094
},
"harness|hellaswag|10": {
"acc": 0.6719776936865166,
"acc_stderr": 0.004685334844038661,
"acc_norm": 0.866759609639514,
"acc_norm_stderr": 0.003391398293613441
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7481481481481481,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.7481481481481481,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474945,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474945
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8113207547169812,
"acc_stderr": 0.024079995130062253,
"acc_norm": 0.8113207547169812,
"acc_norm_stderr": 0.024079995130062253
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.03414014007044036,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.03414014007044036
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7617021276595745,
"acc_stderr": 0.027851252973889774,
"acc_norm": 0.7617021276595745,
"acc_norm_stderr": 0.027851252973889774
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5701754385964912,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.5701754385964912,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7586206896551724,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.7586206896551724,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.022569897074918424,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.022569897074918424
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9032258064516129,
"acc_stderr": 0.016818943416345197,
"acc_norm": 0.9032258064516129,
"acc_norm_stderr": 0.016818943416345197
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6847290640394089,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.6847290640394089,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.026024657651656187,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.026024657651656187
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.018263105420199505,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.018263105420199505
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.011464523356953162,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.011464523356953162
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8102564102564103,
"acc_stderr": 0.0198801654065888,
"acc_norm": 0.8102564102564103,
"acc_norm_stderr": 0.0198801654065888
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.03038416923235083,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.03038416923235083
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.023274255898707952,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.023274255898707952
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5165562913907285,
"acc_stderr": 0.04080244185628972,
"acc_norm": 0.5165562913907285,
"acc_norm_stderr": 0.04080244185628972
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9247706422018349,
"acc_stderr": 0.011308662537571727,
"acc_norm": 0.9247706422018349,
"acc_norm_stderr": 0.011308662537571727
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.01809424711647333,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.01809424711647333
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.820627802690583,
"acc_stderr": 0.0257498195691928,
"acc_norm": 0.820627802690583,
"acc_norm_stderr": 0.0257498195691928
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540627,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540627
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563275,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.026321383198783674,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.026321383198783674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253858,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253858
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9144316730523627,
"acc_stderr": 0.010002965568647286,
"acc_norm": 0.9144316730523627,
"acc_norm_stderr": 0.010002965568647286
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8294797687861272,
"acc_stderr": 0.020247961569303728,
"acc_norm": 0.8294797687861272,
"acc_norm_stderr": 0.020247961569303728
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7988826815642458,
"acc_stderr": 0.013405946402609047,
"acc_norm": 0.7988826815642458,
"acc_norm_stderr": 0.013405946402609047
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8594771241830066,
"acc_stderr": 0.019899435463539946,
"acc_norm": 0.8594771241830066,
"acc_norm_stderr": 0.019899435463539946
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7942122186495176,
"acc_stderr": 0.022961339906764244,
"acc_norm": 0.7942122186495176,
"acc_norm_stderr": 0.022961339906764244
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.018689725721062075,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.018689725721062075
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6312056737588653,
"acc_stderr": 0.028782227561347254,
"acc_norm": 0.6312056737588653,
"acc_norm_stderr": 0.028782227561347254
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5912646675358539,
"acc_stderr": 0.012555701346703382,
"acc_norm": 0.5912646675358539,
"acc_norm_stderr": 0.012555701346703382
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8308823529411765,
"acc_stderr": 0.022770868010113014,
"acc_norm": 0.8308823529411765,
"acc_norm_stderr": 0.022770868010113014
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8218954248366013,
"acc_stderr": 0.01547836965310857,
"acc_norm": 0.8218954248366013,
"acc_norm_stderr": 0.01547836965310857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8530612244897959,
"acc_stderr": 0.02266540041721764,
"acc_norm": 0.8530612244897959,
"acc_norm_stderr": 0.02266540041721764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9104477611940298,
"acc_stderr": 0.02019067053502792,
"acc_norm": 0.9104477611940298,
"acc_norm_stderr": 0.02019067053502792
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.03828401115079021,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.03828401115079021
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276908,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276908
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5263157894736842,
"mc1_stderr": 0.017479241161975457,
"mc2": 0.6996159108788989,
"mc2_stderr": 0.014237498534320117
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370623
},
"harness|gsm8k|5": {
"acc": 0.7225170583775588,
"acc_stderr": 0.01233344758104754
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
joey234/mmlu-high_school_computer_science-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 61583
num_examples: 100
download_size: 37712
dataset_size: 61583
---
# Dataset Card for "mmlu-high_school_computer_science-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bsaylan/adr_tank_1 | ---
license: apache-2.0
---
|
KentoTsu/KENTOVOICE | ---
license: openrail
---
|
Erynan/4_ethics_4 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 12153310
num_examples: 13629
download_size: 2257904
dataset_size: 12153310
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
katarinayuan/ProtST-GeneOntology-MF | ---
configs:
- config_name: default
data_files:
- split: train
path: "gene_ontology_mf_train.csv"
- split: validation
path: "gene_ontology_mf_valid.csv"
- split: test
path: "gene_ontology_mf_test.csv"
--- |
shreevigneshs/iwslt-2023-en-vi-train-val-split-0.1 | ---
dataset_info:
features:
- name: en
dtype: string
- name: vi
dtype: string
- name: vi_annotated
dtype: string
- name: styles
dtype: int64
splits:
- name: train
num_bytes: 326525.0
num_examples: 720
- name: val
num_bytes: 36694.0
num_examples: 80
- name: if_test
num_bytes: 275045.0
num_examples: 598
- name: f_test
num_bytes: 294897.0
num_examples: 598
- name: f_flores
num_bytes: 337966
num_examples: 1012
- name: if_flores
num_bytes: 337966
num_examples: 1012
download_size: 409674
dataset_size: 1609093.0
---
# Dataset Card for "iwslt-2023-en-vi-train-val-split-0.1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
maghwa/OpenHermes-2-AR-10K-4 | ---
dataset_info:
features:
- name: avatarUrl
dtype: 'null'
- name: conversations
dtype: string
- name: source
dtype: string
- name: language
dtype: 'null'
- name: idx
dtype: 'null'
- name: model_name
dtype: 'null'
- name: id
dtype: string
- name: category
dtype: 'null'
- name: title
dtype: 'null'
- name: model
dtype: 'null'
- name: topic
dtype: 'null'
- name: skip_prompt_formatting
dtype: 'null'
- name: system_prompt
dtype: 'null'
- name: views
dtype: float64
- name: custom_instruction
dtype: 'null'
- name: hash
dtype: 'null'
splits:
- name: train
num_bytes: 21544752
num_examples: 10001
download_size: 7825310
dataset_size: 21544752
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
INSAIT-Institute/winogrande-bgeval | ---
language:
- bg
dataset_info:
features:
- name: sentence
dtype: string
- name: option1
dtype: string
- name: option2
dtype: string
- name: answer
dtype: string
splits:
- name: validation
num_bytes: 289382
num_examples: 1267
download_size: 121356
dataset_size: 289382
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
joey234/mmlu-high_school_government_and_politics-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 63813
num_examples: 193
download_size: 39170
dataset_size: 63813
---
# Dataset Card for "mmlu-high_school_government_and_politics-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sinhala-nlp/SemiSOLD | ---
language:
- si
---
# SOLD - A Benchmark for Sinhala Offensive Language Identification
In this repository, we introduce the Sinhala Offensive Language Dataset **(SOLD)** and present multiple experiments on this dataset. **SOLD** is a manually annotated dataset containing 10,000 posts from Twitter annotated as offensive and not offensive at both sentence-level and token-level. **SOLD** is the largest offensive language dataset compiled for Sinhala. We also introduce **SemiSOLD**, a larger dataset containing more than 145,000 Sinhala tweets, annotated following a semi-supervised approach.
:warning: This repository contains texts that may be offensive and harmful.
## Annotation
We use an annotation scheme split into two levels, deciding (a) the offensiveness of a tweet (sentence-level) and (b) the tokens that contribute to the offence (token-level).
### Sentence-level
Our sentence-level offensive language detection follows level A in OLID [(Zampieri et al., 2019)](https://aclanthology.org/N19-1144/). We asked annotators to discriminate between the following types of tweets:
* **Offensive (OFF)**: Posts containing any form of non-acceptable language (profanity) or a targeted offence, which can be veiled or direct. This includes insults, threats, and posts containing profane language or swear words.
* **Not Offensive (NOT)**: Posts that do not contain offense or profanity.
Each tweet was annotated with one of the above labels, which we used as the labels in sentence-level offensive language identification.
### Token-level
To provide a human explanation of the labelling, we collect rationales for the offensive language. Following HateXplain [(Mathew et al., 2021)](https://ojs.aaai.org/index.php/AAAI/article/view/17745), we define a rationale as a specific text segment that justifies the human annotator’s decision on the sentence-level label. Therefore, we ask the annotators to highlight particular tokens in a tweet that support their judgement about the sentence-level label (offensive, not offensive). Specifically, if a tweet is offensive, we guide the annotators to highlight tokens from the text that support the judgement, including non-verbal expressions such as emojis and morphemes that are used to convey the intention. We use this as the token-level offensive labels in SOLD.

## Data
SOLD is released in HuggingFace. It can be loaded in to pandas dataframes using the following code.
```python
from datasets import Dataset
from datasets import load_dataset
sold_train = Dataset.to_pandas(load_dataset('sinhala-nlp/SOLD', split='train'))
sold_test = Dataset.to_pandas(load_dataset('sinhala-nlp/SOLD', split='test'))
```
The dataset contains the following columns:
* **post_id** - Twitter ID
* **text** - Post text
* **tokens** - Tokenised text. Each token is separated by a space.
* **rationals** - Offensive token mask aligned with **tokens**: 1 if the token is offensive, 0 otherwise.
* **label** - Sentence-level label, offensive or not-offensive.
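As an illustration of how these columns fit together, the sketch below zips **tokens** with **rationals** to recover the highlighted tokens. The row values are placeholders, and it assumes **rationals** is available as a list of 0/1 integers (depending on the loader, it may first need to be parsed from a string):

```python
# Hypothetical row following the SOLD schema above (values are placeholders).
row = {
    "post_id": "123456",
    "text": "tok1 tok2 tok3 tok4",
    "tokens": "tok1 tok2 tok3 tok4",
    "rationals": [0, 1, 0, 1],
    "label": "OFF",
}

# Tokens are space-separated; the rationale mask is aligned with them.
tokens = row["tokens"].split(" ")
offensive_tokens = [t for t, m in zip(tokens, row["rationals"]) if m == 1]
print(offensive_tokens)  # ['tok2', 'tok4']
```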

SemiSOLD is also released on HuggingFace and can be loaded into a pandas dataframe using the following code.
```python
from datasets import Dataset
from datasets import load_dataset
semi_sold = Dataset.to_pandas(load_dataset('sinhala-nlp/SemiSOLD', split='train'))
```
The dataset contains the following columns:
* **post_id** - Twitter ID
* **text** - Post text
Furthermore, it contains predicted offensiveness scores from eleven classifiers trained on the SOLD train set: xlmr, xlmt, mbert, sinbert, lstm_ft, cnn_ft, lstm_cbow, cnn_cbow, lstm_sl, cnn_sl and svm.
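The `--std` option in the experiment scripts below exploits these scores: instances are kept for augmentation only when the classifiers agree. A minimal sketch of that selection idea, using only three of the score columns and invented score values:

```python
import statistics

# Hypothetical SemiSOLD-style rows; score columns follow the list above,
# but the values here are invented for illustration.
rows = [
    {"post_id": "1", "xlmr": 0.91, "mbert": 0.88, "svm": 0.93},
    {"post_id": "2", "xlmr": 0.10, "mbert": 0.85, "svm": 0.40},
]

def classifiers_agree(row, score_cols, max_std=0.1):
    # Keep an instance only when the standard deviation of the
    # classifiers' offensiveness scores is below the threshold.
    scores = [row[c] for c in score_cols]
    return statistics.pstdev(scores) <= max_std

cols = ["xlmr", "mbert", "svm"]
kept = [r["post_id"] for r in rows if classifiers_agree(r, cols)]
print(kept)  # ['1']
```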
## Experiments
Clone the repository and install the libraries using the following command (preferably inside a conda environment)
~~~
pip install -r requirements.txt
~~~
### Sentence-level
Sentence-level transformer based experiments can be executed using the following command.
~~~
python -m experiments.sentence_level.sinhala_deepoffense
~~~
The command takes the following arguments:
~~~
--model_type : Type of the transformer model (bert, xlmroberta, roberta, etc.).
--model_name : The exact architecture and trained weights to use. This may be a Hugging Face Transformers compatible pre-trained model, a community model, or the path to a directory containing model files.
--transfer : Whether to perform transfer learning or not (true or false).
--transfer_language : The initial language if transfer learning is performed (hi, en or si).
* hi - Perform transfer learning from HASOC 2019 Hindi dataset (Modha et al., 2019).
* en - Perform transfer learning from the OffensEval English dataset (Zampieri et al., 2019).
* si - Perform transfer learning from CCMS Sinhala dataset (Rathnayake et al., 2021).
--augment : Perform semi-supervised data augmentation.
--std : Standard deviation threshold on the classifier scores, used to filter data augmentation instances.
--augment_type: The type of the data augmentation.
* off - Augment only the offensive instances.
* normal - Augment both offensive and non-offensive instances.
~~~
Sentence-level CNN and LSTM based experiments can be executed using the following command.
~~~
python -m experiments.sentence_level.sinhala_offensive_nn
~~~
The command takes the following arguments:
~~~
--model_type : Type of the architecture (cnn2D, lstm).
--model_name : The exact word embeddings to use. This may be a gensim model or the path to a word embedding file.
--augment : Perform semi-supervised data augmentation.
--std : Standard deviation threshold on the classifier scores, used to filter data augmentation instances.
--augment_type: The type of the data augmentation.
* off - Augment only the offensive instances.
* normal - Augment both offensive and non-offensive instances.
~~~
### Token-level
Token-level transformer based experiments can be executed using the following command.
~~~
python -m experiments.token_level.sinhala_mudes
~~~
The command takes the following arguments:
~~~
--model_type : Type of the transformer model (bert, xlmroberta, roberta, etc.).
--model_name : The exact architecture and trained weights to use. This may be a Hugging Face Transformers compatible pre-trained model, a community model, or the path to a directory containing model files.
--transfer : Whether to perform transfer learning or not (true or false).
--transfer_language : The initial language if transfer learning is performed (hatex or tsd).
* hatex - Perform transfer learning from HateXplain dataset (Mathew et al., 2021).
* tsd - Perform transfer learning from TSD dataset (Pavlopoulos et al., 2021).
~~~
Token-level LIME experiments can be executed using the following command.
~~~
python -m experiments.token_level.sinhala_lime
~~~
The command takes the following arguments:
~~~
--model_type : Type of the transformer model (bert, xlmroberta, roberta, etc.).
--model_name : The exact architecture and trained weights to use. This may be a Hugging Face Transformers compatible pre-trained model, a community model, or the path to a directory containing model files.
~~~
## Acknowledgments
We want to acknowledge Janitha Hapuarachchi, Sachith Suraweera, Chandika Udaya Kumara and Ridmi Randima, the team of volunteer annotators that provided their free time and efforts to help us produce SOLD.
## Citation
If you are using the dataset or the models, please cite the following paper.
~~~
@article{ranasinghe2022sold,
title={SOLD: Sinhala Offensive Language Dataset},
author={Ranasinghe, Tharindu and Anuradha, Isuri and Premasiri, Damith and Silva, Kanishka and Hettiarachchi, Hansi and Uyangodage, Lasitha and Zampieri, Marcos},
journal={arXiv preprint arXiv:2212.00851},
year={2022}
}
~~~ |
AdapterOcean/physics_dataset_standardized_embedded | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 131673144
num_examples: 19999
download_size: 62942340
dataset_size: 131673144
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "physics_dataset_standardized_embedded"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mito0o852/MATH_1GRADE | ---
dataset_info:
features:
- name: formula
dtype: string
- name: result
dtype: int64
splits:
- name: train
num_bytes: 31352169
num_examples: 1000000
download_size: 22114377
dataset_size: 31352169
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
tags:
- MATH
pretty_name: MATH 1st Grade
size_categories:
- 1M<n<10M
---
# Dataset Card for "MATH_1GRADE"
# 1st Grade Math Problems Dataset
This dataset, available on Hugging Face, offers a unique collection of math problems tailored for first-grade students. The problems have been synthetically generated using Python scripts and are designed to challenge and enhance the mathematical skills of young learners in an engaging and accessible way. This README provides an overview of the dataset, including its structure, contents, and how to use it for educational purposes or machine learning tasks.
## Dataset Description
- **Domain**: Education/Mathematics
- **Grade Level**: 1st Grade
- **Contents**: The dataset consists of simple arithmetic problems suitable for first graders, involving basic operations such as addition, subtraction, and understanding of negative numbers.
- **Data Format**: CSV/JSON
## Data Structure
The dataset is structured as follows:
- `formula`: A string representing the math problem. It includes numbers and operations (addition, subtraction) formatted as text, for example "940 + 515 - -441 - -34 + 155".
- `result`: The answer to the math problem, stored as an integer (the dataset schema types it as int64).
### Example Entry
```json
{
"formula": "940 + 515 - -441 - -34 + 155",
"result": 2085
}
```
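Because each formula uses only integer addition and subtraction, an entry can be verified by evaluating the formula string. The helper below is an illustration (not part of the dataset tooling) and validates the characters before evaluating:

```python
import re

def check_entry(entry):
    # Allow only digits, whitespace, '+' and '-' — the operations the
    # card describes — before evaluating the expression.
    if not re.fullmatch(r"[0-9\s+\-]+", entry["formula"]):
        raise ValueError("unexpected characters in formula")
    return eval(entry["formula"]) == entry["result"]

print(check_entry({"formula": "940 + 515 - -441 - -34 + 155", "result": 2085}))  # True
```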
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_79_1713152695 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 184829
num_examples: 457
download_size: 96316
dataset_size: 184829
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ThraggBilly/flickr30k_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 4178820473.876
num_examples: 31783
download_size: 4402850196
dataset_size: 4178820473.876
---
# Dataset Card for "flickr30k_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lifarias/Rasha | ---
license: openrail
---
|
bdsaglam/musique-answerable-2hop-subset-erx-reward-2023-12-30T19-33-03 | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: reward
dtype: int64
splits:
- name: train
num_bytes: 1306097
num_examples: 900
download_size: 89215
dataset_size: 1306097
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Astral-P/MinakoAino | ---
license: wtfpl
---
|