| datasetId | card |
|---|---|
tarasfrompir/Ivrit.ai-based | ---
task_categories:
- automatic-speech-recognition
language:
- he
---
License: https://www.ivrit.ai/en/the-license/
Based on data from https://huggingface.co/ivrit-ai
All conditions for using this data are described at https://www.ivrit.ai/
All data that was used is available at https://huggingface.co/ivrit-ai
For any questions, please make contact at https://huggingface.co/ivrit-ai
This dataset is posted for demonstration purposes. Any illegal actions associated with this dataset are unacceptable.
All data may contain erroneous entries, as this is a demo version.
88 hours |
CyberHarem/shining_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shining/シャイニング/闪灵 (Arknights)
This is the dataset of shining/シャイニング/闪灵 (Arknights), containing 444 images and their tags.
The core tags of this character are `long_hair, horns, breasts, hair_between_eyes, pointy_ears, very_long_hair, white_hair, large_breasts, ponytail, brown_eyes, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
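Pruning here simply means filtering the core character tags out of each image's tag list, since tags shared by nearly every image carry no per-image information. A minimal sketch of the idea (the example tag list is illustrative, not taken from the dataset):

```python
# Core tags of this character, as listed above; they are pruned because
# they appear on (nearly) every image and carry no per-image information.
CORE_TAGS = {
    "long_hair", "horns", "breasts", "hair_between_eyes", "pointy_ears",
    "very_long_hair", "white_hair", "large_breasts", "ponytail",
    "brown_eyes", "red_eyes",
}

def prune_core_tags(tags):
    """Drop core character tags, keeping only image-specific ones."""
    return [t for t in tags if t not in CORE_TAGS]

# Illustrative tag list for a single image.
tags = ["1girl", "long_hair", "horns", "solo", "black_cloak"]
print(prune_core_tags(tags))  # -> ['1girl', 'solo', 'black_cloak']
```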
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 444 | 809.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shining_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 444 | 675.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shining_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1116 | 1.30 GiB | [Download](https://huggingface.co/datasets/CyberHarem/shining_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/shining_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
The following are tag clustering results; distinct outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, holding_staff, looking_at_viewer, necklace, solo, black_cloak, black_dress, cowboy_shot, grey_background, hood_up, hooded_cloak, simple_background, closed_mouth, gradient_background, weapon, long_sleeves, white_background |
| 1 | 11 |  |  |  |  |  | 1girl, black_cloak, solo, hooded_cloak, parted_lips, holding_weapon, looking_at_viewer, sword, hood_up, simple_background, upper_body, white_background, grey_hair, necklace, staff, black_dress |
| 2 | 9 |  |  |  |  |  | 1girl, black_cloak, hood_up, solo, upper_body, closed_mouth, hooded_cloak, looking_at_viewer, grey_hair, necklace |
| 3 | 6 |  |  |  |  |  | 1girl, black_cloak, black_dress, black_footwear, holding_sword, solo, full_body, knee_boots, high_heel_boots, torn_clothes, unsheathing, standing, thigh_strap |
| 4 | 13 |  |  |  |  |  | 1girl, bare_shoulders, official_alternate_costume, solo, looking_at_viewer, upper_body, black_one-piece_swimsuit, simple_background, closed_mouth, sidelocks, white_background, blush, grey_hair, detached_sleeves, smile |
| 5 | 10 |  |  |  |  |  | 1girl, bare_shoulders, black_one-piece_swimsuit, looking_at_viewer, official_alternate_costume, solo, detached_sleeves, holding_staff, thigh_strap, thighs, infection_monitor_(arknights), cowboy_shot, closed_mouth, parted_lips, smile, blush, sitting |
| 6 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_one-piece_swimsuit, looking_at_viewer, official_alternate_costume, parted_lips, sandals, solo, thigh_strap, thighs, water, detached_sleeves, hair_spread_out, bare_legs, blonde_hair, full_body, yokozuwari, absurdly_long_hair, bird, holding_staff, outdoors, wariza |
| 7 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, closed_mouth, looking_at_viewer, solo, official_alternate_costume, upper_body, horn_ornament, long_sleeves, smile, grey_hair, ribbon, simple_background, white_background |
| 8 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, completely_nude, nipples, solo, navel, collarbone, sidelocks, thighs, blush, parted_lips, simple_background, sitting, smile |
| 9 | 12 |  |  |  |  |  | 1girl, hetero, 1boy, penis, solo_focus, blush, nipples, sweat, upper_body, completely_nude, paizuri, sidelocks, breasts_squeezed_together, fellatio, from_side, open_mouth, bar_censor, ejaculation |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | holding_staff | looking_at_viewer | necklace | solo | black_cloak | black_dress | cowboy_shot | grey_background | hood_up | hooded_cloak | simple_background | closed_mouth | gradient_background | weapon | long_sleeves | white_background | parted_lips | holding_weapon | sword | upper_body | grey_hair | staff | black_footwear | holding_sword | full_body | knee_boots | high_heel_boots | torn_clothes | unsheathing | standing | thigh_strap | bare_shoulders | official_alternate_costume | black_one-piece_swimsuit | sidelocks | blush | detached_sleeves | smile | thighs | infection_monitor_(arknights) | sitting | sandals | water | hair_spread_out | bare_legs | blonde_hair | yokozuwari | absurdly_long_hair | bird | outdoors | wariza | horn_ornament | ribbon | completely_nude | nipples | navel | collarbone | hetero | 1boy | penis | solo_focus | sweat | paizuri | breasts_squeezed_together | fellatio | from_side | open_mouth | bar_censor | ejaculation |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:--------------------|:-----------|:-------|:--------------|:--------------|:--------------|:------------------|:----------|:---------------|:--------------------|:---------------|:----------------------|:---------|:---------------|:-------------------|:--------------|:-----------------|:--------|:-------------|:------------|:--------|:-----------------|:----------------|:------------|:-------------|:------------------|:---------------|:--------------|:-----------|:--------------|:-----------------|:-----------------------------|:---------------------------|:------------|:--------|:-------------------|:--------|:---------|:--------------------------------|:----------|:----------|:--------|:------------------|:------------|:--------------|:-------------|:---------------------|:-------|:-----------|:---------|:----------------|:---------|:------------------|:----------|:--------|:-------------|:---------|:-------|:--------|:-------------|:--------|:----------|:----------------------------|:-----------|:------------|:-------------|:-------------|:--------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | | X | X | X | X | X | | | X | X | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | X | X | X | X | | | | X | X | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | | X | X | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 13 |  |  |  |  |  | X | | X | | X | | | | | | | X | X | | | | X | | | | X | X | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 10 |  |  |  |  |  | X | X | X | | X | | | X | | | | | X | | | | | X | | | | | | | | | | | | | | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | | X | | | | | | | | | | | | | X | | | | | | | | X | | | | | | X | X | X | X | | | X | | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | X | | X | | X | | | | | X | X | | | X | X | | | | X | X | | | | | | | | | | | X | X | | | | | X | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | X | | X | | X | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | X | X | | X | X | | X | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | |
| 9 | 12 |  |  |  |  |  | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X |
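The table version above is just the raw-text tag lists pivoted into a tag-presence matrix: one column per tag, with `X` marking membership. A minimal sketch of the pivot, using made-up cluster data rather than the actual tags:

```python
# Hypothetical per-cluster tag lists (the real ones are in the raw-text table).
clusters = {
    0: ["1girl", "solo", "holding_staff"],
    1: ["1girl", "solo", "sword"],
}

# Union of all tags, in first-seen order (this becomes the column order).
columns = []
for tags in clusters.values():
    for t in tags:
        if t not in columns:
            columns.append(t)

# Pivot: one row per cluster, 'X' where the tag is present, '' otherwise.
rows = {cid: ["X" if t in tags else "" for t in columns]
        for cid, tags in clusters.items()}

print(columns)  # -> ['1girl', 'solo', 'holding_staff', 'sword']
print(rows[1])  # -> ['X', 'X', '', 'X']
```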
|
fredfang/RH20T | ---
license: other
---
|
yjernite/prof_report__plasmo-vox2__multi__24 | ---
dataset_info:
features:
- name: cluster_id
dtype: int64
- name: cluster_size
dtype: int64
- name: img_ids
sequence: int64
- name: img_cluster_scores
sequence: float64
splits:
- name: accountant
num_bytes: 1744
num_examples: 6
- name: aerospace_engineer
num_bytes: 1936
num_examples: 14
- name: aide
num_bytes: 1912
num_examples: 13
- name: air_conditioning_installer
num_bytes: 1720
num_examples: 5
- name: architect
num_bytes: 1816
num_examples: 9
- name: artist
num_bytes: 1936
num_examples: 14
- name: author
num_bytes: 1768
num_examples: 7
- name: baker
num_bytes: 1960
num_examples: 15
- name: bartender
num_bytes: 1696
num_examples: 4
- name: bus_driver
num_bytes: 1816
num_examples: 9
- name: butcher
num_bytes: 1768
num_examples: 7
- name: career_counselor
num_bytes: 1792
num_examples: 8
- name: carpenter
num_bytes: 1768
num_examples: 7
- name: carpet_installer
num_bytes: 1696
num_examples: 4
- name: cashier
num_bytes: 1888
num_examples: 12
- name: ceo
num_bytes: 1792
num_examples: 8
- name: childcare_worker
num_bytes: 1840
num_examples: 10
- name: civil_engineer
num_bytes: 1792
num_examples: 8
- name: claims_appraiser
num_bytes: 1744
num_examples: 6
- name: cleaner
num_bytes: 1888
num_examples: 12
- name: clergy
num_bytes: 1816
num_examples: 9
- name: clerk
num_bytes: 1984
num_examples: 16
- name: coach
num_bytes: 1768
num_examples: 7
- name: community_manager
num_bytes: 1792
num_examples: 8
- name: compliance_officer
num_bytes: 1816
num_examples: 9
- name: computer_programmer
num_bytes: 1720
num_examples: 5
- name: computer_support_specialist
num_bytes: 1792
num_examples: 8
- name: computer_systems_analyst
num_bytes: 1744
num_examples: 6
- name: construction_worker
num_bytes: 1768
num_examples: 7
- name: cook
num_bytes: 1864
num_examples: 11
- name: correctional_officer
num_bytes: 1816
num_examples: 9
- name: courier
num_bytes: 1984
num_examples: 16
- name: credit_counselor
num_bytes: 1864
num_examples: 11
- name: customer_service_representative
num_bytes: 1792
num_examples: 8
- name: data_entry_keyer
num_bytes: 1840
num_examples: 10
- name: dental_assistant
num_bytes: 1744
num_examples: 6
- name: dental_hygienist
num_bytes: 1696
num_examples: 4
- name: dentist
num_bytes: 1744
num_examples: 6
- name: designer
num_bytes: 1864
num_examples: 11
- name: detective
num_bytes: 1744
num_examples: 6
- name: director
num_bytes: 1816
num_examples: 9
- name: dishwasher
num_bytes: 1864
num_examples: 11
- name: dispatcher
num_bytes: 1768
num_examples: 7
- name: doctor
num_bytes: 1816
num_examples: 9
- name: drywall_installer
num_bytes: 1696
num_examples: 4
- name: electrical_engineer
num_bytes: 1816
num_examples: 9
- name: electrician
num_bytes: 1696
num_examples: 4
- name: engineer
num_bytes: 1744
num_examples: 6
- name: event_planner
num_bytes: 1720
num_examples: 5
- name: executive_assistant
num_bytes: 1696
num_examples: 4
- name: facilities_manager
num_bytes: 1792
num_examples: 8
- name: farmer
num_bytes: 1696
num_examples: 4
- name: fast_food_worker
num_bytes: 1912
num_examples: 13
- name: file_clerk
num_bytes: 1864
num_examples: 11
- name: financial_advisor
num_bytes: 1672
num_examples: 3
- name: financial_analyst
num_bytes: 1768
num_examples: 7
- name: financial_manager
num_bytes: 1768
num_examples: 7
- name: firefighter
num_bytes: 1696
num_examples: 4
- name: fitness_instructor
num_bytes: 1720
num_examples: 5
- name: graphic_designer
num_bytes: 1840
num_examples: 10
- name: groundskeeper
num_bytes: 1720
num_examples: 5
- name: hairdresser
num_bytes: 1792
num_examples: 8
- name: head_cook
num_bytes: 1840
num_examples: 10
- name: health_technician
num_bytes: 1840
num_examples: 10
- name: industrial_engineer
num_bytes: 1744
num_examples: 6
- name: insurance_agent
num_bytes: 1768
num_examples: 7
- name: interior_designer
num_bytes: 1840
num_examples: 10
- name: interviewer
num_bytes: 1840
num_examples: 10
- name: inventory_clerk
num_bytes: 1888
num_examples: 12
- name: it_specialist
num_bytes: 1720
num_examples: 5
- name: jailer
num_bytes: 1768
num_examples: 7
- name: janitor
num_bytes: 1840
num_examples: 10
- name: laboratory_technician
num_bytes: 1816
num_examples: 9
- name: language_pathologist
num_bytes: 1864
num_examples: 11
- name: lawyer
num_bytes: 1768
num_examples: 7
- name: librarian
num_bytes: 1840
num_examples: 10
- name: logistician
num_bytes: 1816
num_examples: 9
- name: machinery_mechanic
num_bytes: 1744
num_examples: 6
- name: machinist
num_bytes: 1768
num_examples: 7
- name: maid
num_bytes: 1768
num_examples: 7
- name: manager
num_bytes: 1768
num_examples: 7
- name: manicurist
num_bytes: 1792
num_examples: 8
- name: market_research_analyst
num_bytes: 1840
num_examples: 10
- name: marketing_manager
num_bytes: 1768
num_examples: 7
- name: massage_therapist
num_bytes: 1768
num_examples: 7
- name: mechanic
num_bytes: 1720
num_examples: 5
- name: mechanical_engineer
num_bytes: 1816
num_examples: 9
- name: medical_records_specialist
num_bytes: 1792
num_examples: 8
- name: mental_health_counselor
num_bytes: 1840
num_examples: 10
- name: metal_worker
num_bytes: 1696
num_examples: 4
- name: mover
num_bytes: 1888
num_examples: 12
- name: musician
num_bytes: 1840
num_examples: 10
- name: network_administrator
num_bytes: 1648
num_examples: 2
- name: nurse
num_bytes: 1744
num_examples: 6
- name: nursing_assistant
num_bytes: 1696
num_examples: 4
- name: nutritionist
num_bytes: 1768
num_examples: 7
- name: occupational_therapist
num_bytes: 1840
num_examples: 10
- name: office_clerk
num_bytes: 1840
num_examples: 10
- name: office_worker
num_bytes: 1864
num_examples: 11
- name: painter
num_bytes: 1864
num_examples: 11
- name: paralegal
num_bytes: 1792
num_examples: 8
- name: payroll_clerk
num_bytes: 1768
num_examples: 7
- name: pharmacist
num_bytes: 1864
num_examples: 11
- name: pharmacy_technician
num_bytes: 1744
num_examples: 6
- name: photographer
num_bytes: 1864
num_examples: 11
- name: physical_therapist
num_bytes: 1840
num_examples: 10
- name: pilot
num_bytes: 1816
num_examples: 9
- name: plane_mechanic
num_bytes: 1888
num_examples: 12
- name: plumber
num_bytes: 1720
num_examples: 5
- name: police_officer
num_bytes: 1768
num_examples: 7
- name: postal_worker
num_bytes: 1912
num_examples: 13
- name: printing_press_operator
num_bytes: 1816
num_examples: 9
- name: producer
num_bytes: 1888
num_examples: 12
- name: psychologist
num_bytes: 1768
num_examples: 7
- name: public_relations_specialist
num_bytes: 1720
num_examples: 5
- name: purchasing_agent
num_bytes: 1816
num_examples: 9
- name: radiologic_technician
num_bytes: 1768
num_examples: 7
- name: real_estate_broker
num_bytes: 1696
num_examples: 4
- name: receptionist
num_bytes: 1696
num_examples: 4
- name: repair_worker
num_bytes: 1744
num_examples: 6
- name: roofer
num_bytes: 1696
num_examples: 4
- name: sales_manager
num_bytes: 1624
num_examples: 1
- name: salesperson
num_bytes: 1696
num_examples: 4
- name: school_bus_driver
num_bytes: 1840
num_examples: 10
- name: scientist
num_bytes: 1888
num_examples: 12
- name: security_guard
num_bytes: 1744
num_examples: 6
- name: sheet_metal_worker
num_bytes: 1768
num_examples: 7
- name: singer
num_bytes: 1960
num_examples: 15
- name: social_assistant
num_bytes: 1864
num_examples: 11
- name: social_worker
num_bytes: 1936
num_examples: 14
- name: software_developer
num_bytes: 1672
num_examples: 3
- name: stocker
num_bytes: 1864
num_examples: 11
- name: supervisor
num_bytes: 1864
num_examples: 11
- name: taxi_driver
num_bytes: 1864
num_examples: 11
- name: teacher
num_bytes: 1912
num_examples: 13
- name: teaching_assistant
num_bytes: 1816
num_examples: 9
- name: teller
num_bytes: 1984
num_examples: 16
- name: therapist
num_bytes: 1840
num_examples: 10
- name: tractor_operator
num_bytes: 1696
num_examples: 4
- name: truck_driver
num_bytes: 1696
num_examples: 4
- name: tutor
num_bytes: 1984
num_examples: 16
- name: underwriter
num_bytes: 1744
num_examples: 6
- name: veterinarian
num_bytes: 1744
num_examples: 6
- name: welder
num_bytes: 1768
num_examples: 7
- name: wholesale_buyer
num_bytes: 1792
num_examples: 8
- name: writer
num_bytes: 1816
num_examples: 9
download_size: 635999
dataset_size: 262520
---
# Dataset Card for "prof_report__plasmo-vox2__multi__24"
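Per the schema in the metadata above, each row pairs a cluster with parallel `img_ids` and `img_cluster_scores` sequences. A hedged sketch of consuming one such record (all field values here are invented for illustration):

```python
# A record shaped like the dataset's features: cluster_id/cluster_size are
# int64 scalars, img_ids and img_cluster_scores are parallel sequences.
record = {
    "cluster_id": 0,
    "cluster_size": 3,
    "img_ids": [11, 42, 7],
    "img_cluster_scores": [0.91, 0.85, 0.80],
}

# Pair each image with its cluster score and rank by score, highest first.
ranked = sorted(zip(record["img_ids"], record["img_cluster_scores"]),
                key=lambda p: p[1], reverse=True)
print(ranked[0])  # -> (11, 0.91)
```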
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ttabo/Dustin_Siu | ---
license: apache-2.0
---
|
jikaixuan/mistral_generated | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: generate_0
list:
- name: content
dtype: string
- name: role
dtype: string
- name: generate_1
list:
- name: content
dtype: string
- name: role
dtype: string
- name: generate_2
list:
- name: content
dtype: string
- name: role
dtype: string
- name: probability
sequence:
sequence: float64
splits:
- name: train
num_bytes: 854325595
num_examples: 61135
download_size: 475985779
dataset_size: 854325595
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mistral_generated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
trevin-wadu/npm3d-kitti-carla | ---
license: other
---
|
Vageesh1/Smart_Contract_HF_bc_succ | ---
dataset_info:
features:
- name: bytecode
dtype: string
- name: success
dtype: string
splits:
- name: train
num_bytes: 951795804
num_examples: 60000
download_size: 320350475
dataset_size: 951795804
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Smart_Contract_HF_bc_succ"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
juancopi81/orca-math-word-problems-130026_140028 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 12132319
num_examples: 10002
download_size: 4227963
dataset_size: 12132319
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PrkhrAwsti/Twitter_Sentiment_3M | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_mnli_a_ing | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 810269
num_examples: 3322
- name: dev_mismatched
num_bytes: 928761
num_examples: 3712
- name: test_matched
num_bytes: 841249
num_examples: 3473
- name: test_mismatched
num_bytes: 899925
num_examples: 3605
- name: train
num_bytes: 32720971
num_examples: 133743
download_size: 23435025
dataset_size: 36201175
---
# Dataset Card for "MULTI_VALUE_mnli_a_ing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
houck2040/testpipe | ---
license: mit
---
|
dhuynh95/Evol-Instruct-Python-1k-20percent-backdoor | ---
license: apache-2.0
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4638216
num_examples: 853
download_size: 1917493
dataset_size: 4638216
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
marmofayezi/SDCelebA | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: mask
dtype: image
- name: caption
dtype: string
- name: landmark
dtype: image
- name: generated_image
dtype: image
splits:
- name: train
num_bytes: 2350548373.0
num_examples: 2998
download_size: 1972353294
dataset_size: 2350548373.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
multi-train/xsum_1107 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: query
dtype: string
- name: pos
sequence: string
- name: neg
sequence: string
- name: task
dtype: string
- name: instruction
struct:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
splits:
- name: train
num_bytes: 848430524
num_examples: 200000
download_size: 523334138
dataset_size: 848430524
---
# Dataset Card for "xsum_1107"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
trettenmeier/markt-pilot | ---
license: cc-by-sa-4.0
task_categories:
- text-classification
language:
- de
- en
tags:
- entity resolution
- product matching
pretty_name: Markt-Pilot Dataset for Product Matching
size_categories:
- 100K<n<1M
---
This dataset has an accompanying paper, "Introducing a novel dataset for product matching: A new challenge for matching systems", which has been accepted at The 3rd International Conference on Computers and Automation (CompAuto 2023) and will be published in IEEE Xplore.
The dataset is structured as follows: each data point consists of a pair of products and a binary label that indicates whether the two products refer to the same real-world entity.
The dataset comes in four subsets that differ in size and class distribution:
| Dataset |Data points | Negative | Positive | Imbalance Ratio |
|---|---:|---:|---:|---:|
| Full | 960,532| 665,831 | 294,701 | 2.3 |
| L | 243,954| 199,749 | 44,205 | 4.5 |
| M |66,556 | 59,925 | 6,631 | 9.0 |
| S | 18,973 |17,978 | 995 | 18.1 |
The test set consists of 5,000 manually checked data points and is shared across all four subsets.
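The imbalance ratio in the table is the ratio of negative to positive data points; the figures above can be sanity-checked directly:

```python
# (negatives, positives) per subset, taken from the table above.
subsets = {
    "Full": (665_831, 294_701),
    "L":    (199_749, 44_205),
    "M":    (59_925, 6_631),
    "S":    (17_978, 995),
}

for name, (neg, pos) in subsets.items():
    total = neg + pos                 # matches the "Data points" column
    ratio = round(neg / pos, 1)       # matches the "Imbalance Ratio" column
    print(f"{name}: {total} data points, imbalance ratio {ratio}")
# Full: 960532 data points, imbalance ratio 2.3
```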
|
liferecords/CROCS | ---
license: apache-2.0
---
|
Sujithanumala/Tokenized_Prop_Dataset | ---
dataset_info:
features:
- name: input_ids
sequence: int64
- name: attention_mask
sequence: int64
- name: token_type_ids
sequence: int64
- name: labels
sequence: string
splits:
- name: train
num_bytes: 18648636
num_examples: 1062
download_size: 924111
dataset_size: 18648636
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
guillaumeai/gia-ds-ap2404_v1 | ---
license: cc-by-nc-4.0
tags:
- jgisabelleart
- art
- abstract
--- |
GHOFRANEE/imac_ocr | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 161619
num_examples: 15
download_size: 81428
dataset_size: 161619
---
# Dataset Card for "imac_ocr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vikhrmodels/Veles-2.5 | ---
language:
- ru
configs:
- config_name: default
data_files:
- split: train
path: "hermes_ru.jsonl"
---

# Veles-2.5
OpenHermes-2.5 samples translated into Russian using GPT-3.5 and GPT-4.
This is a translation only; all credit goes to the original authors: https://huggingface.co/datasets/teknium/OpenHermes-2.5 |
abdusah/masc_dev | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- ar
license:
- cc-by-nc-4.0
multilinguality: []
paperswithcode_id: []
pretty_name: 'MASC'
size_categories:
source_datasets: []
task_categories: []
task_ids: []
---
# Dataset Card for MASC: MASSIVE ARABIC SPEECH CORPUS
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://ieee-dataport.org/open-access/masc-massive-arabic-speech-corpus
- **Repository:**
- **Paper:** https://dx.doi.org/10.21227/e1qb-jv46
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This corpus contains 1,000 hours of speech sampled at 16 kHz and crawled from over 700 YouTube channels. MASC is a multi-regional, multi-genre, and multi-dialect dataset that is intended to advance the research and development of Arabic speech technology, with special emphasis on Arabic speech recognition.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Multi-dialect Arabic
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
#### masc_dev
- speech
- sampling_rate
- target_text (label)
### Data Splits
#### masc_dev
- train: 100
- test: 40
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
Note: this is a small development set for testing.
### Dataset Curators
[More Information Needed]
### Licensing Information
CC BY-NC 4.0
### Citation Information
[More Information Needed]
### Contributions
Mohammad Al-Fetyani, Muhammad Al-Barham, Gheith Abandah, Adham Alsharkawi, Maha Dawas, August 18, 2021, "MASC: Massive Arabic Speech Corpus", IEEE Dataport, doi: https://dx.doi.org/10.21227/e1qb-jv46.
|
baber/pdbooks | ---
configs:
- config_name: registrations_all
data_files: "registrations_all.parquet"
- config_name: registrations_not_renewed
data_files: "registrations_not_renewed.parquet"
- config_name: renewals_matched
data_files: "renewals-with-regs.parquet"
- config_name: renewals_unmatched
data_files: "renewals-no-regs.parquet"
# - config_name: renewals
# data_files: "renewals-unmatched.parquet"
# - config_name: not_renewed
# data_files: "FINAL-not-renewed.csv"
# - config_name: previously_published
# data_files: "FINAL-previously-published.csv"
# - config_name: probably_renewed
# data_files: "FINAL-probably-renewed.csv"
# - config_name: possibly_renewed
# data_files: "FINAL-possibly-renewed.csv"
# - config_name: too_early
# data_files: "FINAL-too-early.csv"
# - config_name: too_late
# data_files: "FINAL-too-late.csv"
# - config_name: not_books_proper
# data_files: "FINAL-not-books-proper.csv"
# - config_name: foreign
# data_files: "FINAL-foreign.csv"
# - config_name: renewed
# data_files: "FINAL-renewed.csv"
# - config_name: error
# data_files: "FINAL-error.csv"
license: cc0-1.0
---
|
nateraw/fuego-20230213-195827-b56398 | ---
tags:
- fuego
fuego:
id: 20230213-195827-b56398
status: preparing
script: main.py
requirements_file: requirements.txt
space_id: nateraw/fuego-20230213-195827-b56398
space_hardware: cpu-basic
github_repo_id: pytorch/examples
github_repo_branch: main
github_repo_sha: e4e8da8467d55d28920dbd137261d82255f68c71
---
|
LLukas22/nq | ---
license: cc-by-sa-3.0
task_categories:
- sentence-similarity
- feature-extraction
language:
- en
---
# Dataset Card for "nq"
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
## Dataset Description
- **Homepage:** [https://ai.google.com/research/NaturalQuestions](https://ai.google.com/research/NaturalQuestions)
### Dataset Summary
This is a modified version of the original Natural Questions (nq) dataset for retrieval tasks. The original is available [here](https://ai.google.com/research/NaturalQuestions).
It contains Google queries and an entire stripped Wikipedia article for each query.
## Dataset Structure
### Data Instances
An example of 'train' looks as follows.
```json
{
"question": "who competes in miss universe miss america or miss usa",
"context": "Miss USA - Wikipedia\nThe Miss USA is an American beauty pageant that has been held annually since 1952 to select the Amer ...",
}
```
### Data Fields
The data fields are the same among all splits.
- `question`: a `string` feature.
- `context`: a `string` feature.
## Additional Information
### Licensing Information
This dataset is distributed under the cc-by-sa-3.0 license. |
johnpaulbin/autotrain-data-english-tokipona | ---
task_categories:
- translation
---
# AutoTrain Dataset for project: english-tokipona
## Dataset Description
This dataset has been automatically processed by AutoTrain for project english-tokipona.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"target": "mi kama jo e pali ante.",
"source": "I'll find another job."
},
{
"target": "tenpo pini weka la mi moku lon poka sina.",
"source": "It's been a while since we've had lunch together."
}
]
```
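As a minimal sketch, the samples above can be reshaped into the nested `{"translation": {...}}` layout commonly used for translation fine-tuning; the language keys `"en"` and `"tok"` are assumptions, since the card reports the BCP-47 code as `unk`:

```python
# Reshape AutoTrain-style {source, target} samples into translation dicts.
# The sample is taken from the card; keys "en"/"tok" are assumed, not from the card.
samples = [
    {"target": "mi kama jo e pali ante.", "source": "I'll find another job."},
]

def to_translation(example):
    return {"translation": {"en": example["source"], "tok": example["target"]}}

converted = [to_translation(s) for s in samples]
```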
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"target": "Value(dtype='string', id=None)",
"source": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 16824 |
| valid | 4206 |
|
cahya/instructions-fr | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 32499221.675393682
num_examples: 73841
- name: test
num_bytes: 855601.7244750926
num_examples: 1944
- name: validation
num_bytes: 855161.6001312268
num_examples: 1943
download_size: 19462874
dataset_size: 34209985.0
---
# Dataset Card for "instructions-fr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/393_Hours_Korean_Children_Speech_Data_by_Mobile_Phone | ---
license: cc-by-nc-nd-4.0
---
## Description
Korean (Korea) children's scripted monologue smartphone speech dataset, collected from monologues based on given scripts, covering essay stories and numbers. Transcribed with text content and other attributes. Our dataset was collected from an extensive and geographically diverse pool of speakers, enhancing model performance in real and complex tasks. Quality tested by various AI companies. We strictly adhere to data protection regulations and privacy standards, ensuring the maintenance of user privacy and legal rights throughout the data collection, storage, and usage processes; our datasets are all GDPR, CCPA, and PIPL compliant.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1112?source=Huggingface
# Specifications
## Format
16kHz, 16bit, uncompressed wav, mono channel
## Recording environment
quiet indoor environment, without echo
## Recording content (read speech)
children's books; human-machine interaction category; smart home command and control category; numbers; general category
## Speaker
1,085 Korean children, all aged 6-15 years
## Recording device
Android Smartphone, iPhone
## Country
Korea
## Language
Korean
## Accuracy rate
Sentence Accuracy Rate (SAR) 95%
# Licensing Information
Commercial License
|
tilemachos/Demo-Dataset | ---
license: unlicense
---
|
Cognitive-Lab/Aya_Tamil | ---
dataset_info:
- config_name: complete_dataset
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 4808907428
num_examples: 3582574
download_size: 1449098594
dataset_size: 4808907428
- config_name: templated_indic_sentiment
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 821843
num_examples: 1156
download_size: 321367
dataset_size: 821843
- config_name: templated_tamil_stories
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 14555943
num_examples: 1202
download_size: 4886244
dataset_size: 14555943
- config_name: templated_tamil_thirukkural
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 7722387
num_examples: 3990
download_size: 1430774
dataset_size: 7722387
- config_name: templated_xlel_wd
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 6423580
num_examples: 3861
download_size: 2194151
dataset_size: 6423580
- config_name: translated_adversarial_qa
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 27100085
num_examples: 10000
download_size: 6134878
dataset_size: 27100085
- config_name: translated_cnn_dailymail
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 725255684
num_examples: 100000
download_size: 245351920
dataset_size: 725255684
- config_name: translated_dolly
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 36095149
num_examples: 14808
download_size: 12765749
dataset_size: 36095149
- config_name: translated_flan_coqa
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 47190884
num_examples: 6409
download_size: 16643233
dataset_size: 47190884
- config_name: translated_flan_cot
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 116616264
num_examples: 91910
download_size: 37692925
dataset_size: 116616264
- config_name: translated_flan_gem_wiki
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 185434422
num_examples: 27147
download_size: 60004679
dataset_size: 185434422
- config_name: translated_flan_lambada
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 3401706
num_examples: 4279
download_size: 1159040
dataset_size: 3401706
- config_name: translated_flan_qa
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 499647
num_examples: 540
download_size: 165535
dataset_size: 499647
- config_name: translated_hotpotqa
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 193362032
num_examples: 355476
download_size: 53713487
dataset_size: 193362032
- config_name: translated_joke_explaination
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 1634648
num_examples: 754
download_size: 329076
dataset_size: 1634648
- config_name: translated_mintaka
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 6282071
num_examples: 14000
download_size: 1012425
dataset_size: 6282071
- config_name: translated_nqopen
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 60333480
num_examples: 175850
download_size: 15785046
dataset_size: 60333480
- config_name: translated_paws
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 53414859
num_examples: 49401
download_size: 6714953
dataset_size: 53414859
- config_name: translated_piqa
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 20835853
num_examples: 16113
download_size: 5428019
dataset_size: 20835853
- config_name: translated_soda
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 1134110413
num_examples: 1191582
download_size: 300845773
dataset_size: 1134110413
- config_name: translated_wiki_split
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 1194234452
num_examples: 989944
download_size: 338917207
dataset_size: 1194234452
- config_name: translated_wikiqa
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 832810
num_examples: 1040
download_size: 280200
dataset_size: 832810
- config_name: translated_xlel_wd
features:
- name: id
dtype: int64
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: dataset_name
dtype: string
- name: language
dtype: string
- name: template_id
dtype: int64
- name: targets
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: sub_dataset_name
dtype: string
splits:
- name: train
num_bytes: 972749216
num_examples: 523112
download_size: 337128275
dataset_size: 972749216
configs:
- config_name: complete_dataset
data_files:
- split: train
path: complete_dataset/train-*
- config_name: templated_indic_sentiment
data_files:
- split: train
path: templated_indic_sentiment/train-*
- config_name: templated_tamil_stories
data_files:
- split: train
path: templated_tamil_stories/train-*
- config_name: templated_tamil_thirukkural
data_files:
- split: train
path: templated_tamil_thirukkural/train-*
- config_name: templated_xlel_wd
data_files:
- split: train
path: templated_xlel_wd/train-*
- config_name: translated_adversarial_qa
data_files:
- split: train
path: translated_adversarial_qa/train-*
- config_name: translated_cnn_dailymail
data_files:
- split: train
path: translated_cnn_dailymail/train-*
- config_name: translated_dolly
data_files:
- split: train
path: translated_dolly/train-*
- config_name: translated_flan_coqa
data_files:
- split: train
path: translated_flan_coqa/train-*
- config_name: translated_flan_cot
data_files:
- split: train
path: translated_flan_cot/train-*
- config_name: translated_flan_gem_wiki
data_files:
- split: train
path: translated_flan_gem_wiki/train-*
- config_name: translated_flan_lambada
data_files:
- split: train
path: translated_flan_lambada/train-*
- config_name: translated_flan_qa
data_files:
- split: train
path: translated_flan_qa/train-*
- config_name: translated_hotpotqa
data_files:
- split: train
path: translated_hotpotqa/train-*
- config_name: translated_joke_explaination
data_files:
- split: train
path: translated_joke_explaination/train-*
- config_name: translated_mintaka
data_files:
- split: train
path: translated_mintaka/train-*
- config_name: translated_nqopen
data_files:
- split: train
path: translated_nqopen/train-*
- config_name: translated_paws
data_files:
- split: train
path: translated_paws/train-*
- config_name: translated_piqa
data_files:
- split: train
path: translated_piqa/train-*
- config_name: translated_soda
data_files:
- split: train
path: translated_soda/train-*
- config_name: translated_wiki_split
data_files:
- split: train
path: translated_wiki_split/train-*
- config_name: translated_wikiqa
data_files:
- split: train
path: translated_wikiqa/train-*
- config_name: translated_xlel_wd
data_files:
- split: train
path: translated_xlel_wd/train-*
license: apache-2.0
language:
- en
- ta
---
# Aya_Tamil
This Dataset is curated from the original [Aya-Collection](https://huggingface.co/datasets/CohereForAI/aya_collection) dataset that was open-sourced by [Cohere](https://cohere.com/research) under the [Apache-2.0](https://choosealicense.com/licenses/apache-2.0/) license.
The Aya Collection is a massive multilingual collection comprising 513 million instances of prompts and completions that cover a wide range of tasks. This collection uses instruction-style templates from fluent speakers and applies them to a curated list of datasets. It also includes translations of instruction-style datasets into 101 languages. The Aya Dataset, a human-curated multilingual instruction and response dataset, is part of this collection. Refer to our paper for more details about the collection.
### Motivations & Intentions
The original dataset is large and organized by task rather than by language. To work with a specific Indic language, one previously needed to download the entire dataset (~600 GB) and filter it.
As we were training an Indic LLM internally, we filtered the dataset by language and curated this dataset.
You can find all the Indic-language specific datasets - [here](https://huggingface.co/collections/Cognitive-Lab/aya-indic-suite-65eaa0e34a2307f30bbd55e5).
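As a minimal sketch of the kind of language filter described above, using the field names from this card: the records below are illustrative (not real data), the code `"tam"` for Tamil is an assumption, and with the real dataset one would use the `datasets` library's `.filter()` instead of a list comprehension.

```python
# Filter Aya-style records by the `language` field, as described above.
# Records are illustrative; the code "tam" (Tamil) is an assumption.
records = [
    {"id": 1, "inputs": "...", "targets": "...", "language": "tam"},
    {"id": 2, "inputs": "...", "targets": "...", "language": "eng"},
]

tamil_only = [r for r in records if r["language"] == "tam"]
```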
## **Data Instances**
An example of a `train` instance looks as follows:
```yaml
{'id': 246001,
'inputs': 'The following query in English is taken from the geography category. What could be the answer to the question?\nWhat is the seventh tallest mountain in North America?',
'targets': 'The answer is Mount Lucania.',
'dataset_name': 'Mintaka-inst',
'sub_dataset_name': '-',
'task_type': 'question-answering',
'template_id': 3,
'language': 'eng',
'split': 'train',
'script': 'Latn'
}
```
## **Data Fields**
The data fields are the same among all splits:
- `id:` Unique id of the data point
- `inputs:` Prompt or input to the language model.
- `targets:` Completion or output of the language model.
- `dataset_name:` The name of the source dataset that the data point was taken from
- `sub_dataset_name:` If the source is a collection, this field indicates which part of that collection the data point was taken from. If it is not a collection, this field is left blank.
- `task_type:` The task type that this conversation belongs to.
- `template_id`: The id of the template applied to this data point.
- `language:` The ISO code of the dialect of the conversation.
- `script:` The script of the language.
- `split:` Indicates whether the data point is part of the `train` or the `test` split.
## **Licensing Information**
This dataset can be used for any purpose, whether academic or commercial, under the terms of the **[Apache 2.0](https://opensource.org/license/apache-2-0)** License.
### Citation
```bibtex
@misc{singh2024aya,
title={Aya Dataset: An Open-Access Collection for Multilingual Instruction Tuning},
author={Shivalika Singh and Freddie Vargus and Daniel Dsouza and Börje F. Karlsson and Abinaya Mahendiran and Wei-Yin Ko and Herumb Shandilya and Jay Patel and Deividas Mataciunas and Laura OMahony and Mike Zhang and Ramith Hettiarachchi and Joseph Wilson and Marina Machado and Luisa Souza Moura and Dominik Krzemiński and Hakimeh Fadaei and Irem Ergün and Ifeoma Okoh and Aisha Alaagib and Oshan Mudannayake and Zaid Alyafeai and Vu Minh Chien and Sebastian Ruder and Surya Guthikonda and Emad A. Alghamdi and Sebastian Gehrmann and Niklas Muennighoff and Max Bartolo and Julia Kreutzer and Ahmet Üstün and Marzieh Fadaee and Sara Hooker},
year={2024},
eprint={2402.06619},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
liuyanchen1015/MULTI_VALUE_cola_volition_changes | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 150
num_examples: 3
- name: test
num_bytes: 211
num_examples: 4
- name: train
num_bytes: 2867
num_examples: 38
download_size: 7112
dataset_size: 3228
---
# Dataset Card for "MULTI_VALUE_cola_volition_changes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rakshit122/zavvv11 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: string
splits:
- name: train
num_bytes: 46270
num_examples: 226
download_size: 16707
dataset_size: 46270
---
# Dataset Card for "zavvv11"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
openlifescienceai/medqa | ---
dataset_info:
features:
- name: id
dtype: string
- name: data
struct:
- name: Correct Answer
dtype: string
- name: Correct Option
dtype: string
- name: Options
struct:
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: Question
dtype: string
- name: subject_name
dtype: string
splits:
- name: train
num_bytes: 9503782
num_examples: 10178
- name: test
num_bytes: 1215843
num_examples: 1273
- name: dev
num_bytes: 1189300
num_examples: 1272
download_size: 7137982
dataset_size: 11908925
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: dev
path: data/dev-*
---
|
distilled-from-one-sec-cv12/chunk_100 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1300125484
num_examples: 253337
download_size: 1329085369
dataset_size: 1300125484
---
# Dataset Card for "chunk_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hironow/aya_collection_japanese_test | ---
size_categories: 100K<n<1M
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for aya_collection_japanese_test
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("hironow/aya_collection_japanese_test")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("hironow/aya_collection_japanese_test")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves; for the moment, only text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| question | Question | text | True | False |
| answer | Answer | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| answer_quality | Answer_quality | rating | True | How would you rate the quality of the answer? (日本語訳: 回答の質をどのように評価しますか?) | [1, 2, 3, 4, 5] |
| answer_correction | Answer_correction | text | False | If you think the answer is not accurate, please, correct it. (日本語訳: 回答が正確でないと思う場合は、修正してください。) | N/A |
The **suggestions** are human- or machine-generated recommendations for each question, intended to assist the annotator during the annotation process. They are always linked to existing questions and named by appending "-suggestion" and "-suggestion-metadata" to the question name; these columns contain the value(s) of the suggestion and its metadata, respectively. The possible values are therefore the same as in the table above, but the column name is appended with "-suggestion" and the metadata with "-suggestion-metadata".
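The naming scheme above can be sketched directly; the question names come from this card's questions table:

```python
# Derive the suggestion column names for each question, following the
# "-suggestion" / "-suggestion-metadata" convention described above.
questions = ["answer_quality", "answer_correction"]

def suggestion_columns(name):
    return [f"{name}-suggestion", f"{name}-suggestion-metadata"]

cols = {q: suggestion_columns(q) for q in questions}
```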
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines**, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"answer": "1700\u5e74,\u6570\u767e\u4eba\u306e\u30d5\u30e9\u30f3\u30b9\u4eba\u30d2\u30e5\u30fc\u30b4\u30ce\u30c3\u30c8\u306f,\u30a4\u30f3\u30b0\u30e9\u30f3\u30c9\u304b\u3089\u30d0\u30fc\u30b8\u30cb\u30a2\u690d\u6c11\u5730\u306b\u79fb\u4f4f\u3057,\u305d\u3053\u3067\u30a4\u30f3\u30b0\u30e9\u30f3\u30c9\u738b\u5ba4\u306f,\u4e0b\u30ce\u30eb\u30d5\u30a9\u30fc\u30af\u90e1\u306e\u571f\u5730\u306e\u63d0\u4f9b\u3092\u7d04\u675f\u3057\u305f. \u5230\u7740\u3059\u308b\u3068 \u690d\u6c11\u5730\u5f53\u5c40\u306f \u5f7c\u3089\u306b\u4ee3\u308f\u308a\u306b \u30b8\u30a7\u30fc\u30e0\u30ba\u5ddd\u306e\u6edd\u304b\u308920\u30de\u30a4\u30eb\u4e0a\u306b\u3042\u308b \u653e\u68c4\u3055\u308c\u305f\u30e2\u30ca\u30ab\u30f3\u306e\u6751 \u30de\u30ca\u30ad\u30f3\u30bf\u30a6\u30f3\u3067 \u571f\u5730\u3092\u63d0\u4f9b\u3057\u307e\u3057\u305f \u73fe\u5728\u30dd\u30a6\u30cf\u30bf\u30f3\u90e1\u3067\u3059 \u73fe\u5728\u306e\u30c1\u30a7\u30b9\u30bf\u30fc\u30d5\u30a3\u30fc\u30eb\u30c9\u90e1\u306b\u5b9a\u4f4f\u8005\u304c\u7740\u9678\u3057\u305f. 1705\u5e745\u670812\u65e5,\u30d0\u30fc\u30b8\u30cb\u30a2\u5dde\u8b70\u4f1a\u306f,\u30de\u30ca\u30ad\u30f3\u30c8\u30a6\u30f3\u306e148\u4eba\u306e\u30d2\u30e5\u30fc\u30b4\u30ce\u30c3\u30c8\u306e\u5165\u7c4d\u3092\u8a8d\u3081\u308b\u6cd5\u6848\u3092\u53ef\u6c7a\u3057\u305f. \u3053\u306e \u5b64\u7acb \u3057 \u305f \u5c45\u4f4f \u533a \u306b \u4f4f\u3093 \u3067 \u3044 \u305f \u6700\u521d \u306e 390 \u4eba \u306e \u3046\u3061,\u591a\u304f \u306e \u4eba \u306f \u6b7b\u3093 \u3067 \u3044 \u307e\u3057 \u305f.\u307b\u304b \u306e \u4eba \u306f \u753a \u306e \u5916 \u3067,\u82f1\u56fd \u306e \u69d8\u5f0f \u306e \u8fb2\u5834 \u306b \u4f4f\u3093 \u3067 \u3044 \u307e\u3057 \u305f.\u307b\u304b \u306e \u4eba \u306f,\u5225 \u306e \u5730\u57df \u306b \u79fb\u308a\u4f4f\u3093 \u3067 \u3044 \u307e\u3057 \u305f. \u5f90\u3005\u306b,\u5f7c\u3089\u306f\u5f7c\u3089\u306e\u30a4\u30ae\u30ea\u30b9\u4eba\u96a3\u4eba\u3068\u7d50\u5a5a\u3057\u305f. 
18\u4e16\u7d00\u304b\u308919\u4e16\u7d00\u306b\u304b\u3051\u3066,\u30d5\u30e9\u30f3\u30b9\u4eba\u306e\u5b50\u5b6b\u306f,\u897f\u306b\u30d4\u30a8\u30c9\u30e2\u30f3\u30c8\u306b,\u305d\u3057\u3066\u30a2\u30d1\u30e9\u30c1\u30a2\u5c71\u8108\u3092\u8d8a\u3048\u3066,\u30b1\u30f3\u30bf\u30c3\u30ad\u30fc\u5dde,\u30c6\u30cd\u30b7\u30fc\u5dde,\u30df\u30ba\u30fc\u30ea\u5dde,\u305d\u306e\u4ed6\u306e\u5dde\u3068\u306a\u3063\u305f\u897f\u90e8\u306b\u79fb\u4f4f\u3057\u307e\u3057\u305f. \u30de\u30ca\u30ad\u30f3\u30bf\u30a6\u30f3\u5730\u533a\u3067\u306f,\u30b8\u30a7\u30fc\u30e0\u30ba\u5ddd\u3092\u6e21\u308b\u30d2\u30e5\u30fc\u30b4\u30ce\u30c3\u30c8\u8a18\u5ff5\u6a4b\u3068\u30d2\u30e5\u30fc\u30b4\u30ce\u30c3\u30c8\u9053\u8def\u304c\u5f7c\u3089\u306e\u540d\u8a89\u306b\u547d\u540d\u3055\u308c,\u30d2\u30e5\u30fc\u30b4\u30ce\u30c3\u30c8\u9ad8\u6821\u3092\u542b\u3080\u3044\u304f\u3064\u304b\u306e\u5b66\u6821\u3092\u542b\u3080\u591a\u304f\u306e\u5730\u5143\u306e\u7279\u5fb4\u304c\u547d\u540d\u3055\u308c\u305f.",
"question": "242\u4eba\u306e\u30d2\u30e5\u30fc\u30b2\u30ce\u30c3\u30c8\u306f\u3082\u306f\u3084\u30de\u30ca\u30ad\u30f3\u30bf\u30a6\u30f3\u306b\u4f4f\u3093\u3067\u3044\u307e\u305b\u3093\u3067\u3057\u305f. \u3069\u3046\u306a\u3063\u305f\u306e? \u524d\u56de\u306e\u8cea\u554f\u3092\u8003\u3048\u308b\u3068,\u305d\u306e\u7b54\u3048\u3092\u542b\u3080\u6587\u8108\u3092\u66f8\u3044\u3066\u304f\u3060\u3055\u3044. 1\u304b\u308920\u6587\u306e\u6587\u7ae0\u3067\u3059 \u6587\u8108:"
},
"metadata": {},
"responses": [],
"suggestions": [],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"answer": "1700\u5e74,\u6570\u767e\u4eba\u306e\u30d5\u30e9\u30f3\u30b9\u4eba\u30d2\u30e5\u30fc\u30b4\u30ce\u30c3\u30c8\u306f,\u30a4\u30f3\u30b0\u30e9\u30f3\u30c9\u304b\u3089\u30d0\u30fc\u30b8\u30cb\u30a2\u690d\u6c11\u5730\u306b\u79fb\u4f4f\u3057,\u305d\u3053\u3067\u30a4\u30f3\u30b0\u30e9\u30f3\u30c9\u738b\u5ba4\u306f,\u4e0b\u30ce\u30eb\u30d5\u30a9\u30fc\u30af\u90e1\u306e\u571f\u5730\u306e\u63d0\u4f9b\u3092\u7d04\u675f\u3057\u305f. \u5230\u7740\u3059\u308b\u3068 \u690d\u6c11\u5730\u5f53\u5c40\u306f \u5f7c\u3089\u306b\u4ee3\u308f\u308a\u306b \u30b8\u30a7\u30fc\u30e0\u30ba\u5ddd\u306e\u6edd\u304b\u308920\u30de\u30a4\u30eb\u4e0a\u306b\u3042\u308b \u653e\u68c4\u3055\u308c\u305f\u30e2\u30ca\u30ab\u30f3\u306e\u6751 \u30de\u30ca\u30ad\u30f3\u30bf\u30a6\u30f3\u3067 \u571f\u5730\u3092\u63d0\u4f9b\u3057\u307e\u3057\u305f \u73fe\u5728\u30dd\u30a6\u30cf\u30bf\u30f3\u90e1\u3067\u3059 \u73fe\u5728\u306e\u30c1\u30a7\u30b9\u30bf\u30fc\u30d5\u30a3\u30fc\u30eb\u30c9\u90e1\u306b\u5b9a\u4f4f\u8005\u304c\u7740\u9678\u3057\u305f. 1705\u5e745\u670812\u65e5,\u30d0\u30fc\u30b8\u30cb\u30a2\u5dde\u8b70\u4f1a\u306f,\u30de\u30ca\u30ad\u30f3\u30c8\u30a6\u30f3\u306e148\u4eba\u306e\u30d2\u30e5\u30fc\u30b4\u30ce\u30c3\u30c8\u306e\u5165\u7c4d\u3092\u8a8d\u3081\u308b\u6cd5\u6848\u3092\u53ef\u6c7a\u3057\u305f. \u3053\u306e \u5b64\u7acb \u3057 \u305f \u5c45\u4f4f \u533a \u306b \u4f4f\u3093 \u3067 \u3044 \u305f \u6700\u521d \u306e 390 \u4eba \u306e \u3046\u3061,\u591a\u304f \u306e \u4eba \u306f \u6b7b\u3093 \u3067 \u3044 \u307e\u3057 \u305f.\u307b\u304b \u306e \u4eba \u306f \u753a \u306e \u5916 \u3067,\u82f1\u56fd \u306e \u69d8\u5f0f \u306e \u8fb2\u5834 \u306b \u4f4f\u3093 \u3067 \u3044 \u307e\u3057 \u305f.\u307b\u304b \u306e \u4eba \u306f,\u5225 \u306e \u5730\u57df \u306b \u79fb\u308a\u4f4f\u3093 \u3067 \u3044 \u307e\u3057 \u305f. \u5f90\u3005\u306b,\u5f7c\u3089\u306f\u5f7c\u3089\u306e\u30a4\u30ae\u30ea\u30b9\u4eba\u96a3\u4eba\u3068\u7d50\u5a5a\u3057\u305f. 
18\u4e16\u7d00\u304b\u308919\u4e16\u7d00\u306b\u304b\u3051\u3066,\u30d5\u30e9\u30f3\u30b9\u4eba\u306e\u5b50\u5b6b\u306f,\u897f\u306b\u30d4\u30a8\u30c9\u30e2\u30f3\u30c8\u306b,\u305d\u3057\u3066\u30a2\u30d1\u30e9\u30c1\u30a2\u5c71\u8108\u3092\u8d8a\u3048\u3066,\u30b1\u30f3\u30bf\u30c3\u30ad\u30fc\u5dde,\u30c6\u30cd\u30b7\u30fc\u5dde,\u30df\u30ba\u30fc\u30ea\u5dde,\u305d\u306e\u4ed6\u306e\u5dde\u3068\u306a\u3063\u305f\u897f\u90e8\u306b\u79fb\u4f4f\u3057\u307e\u3057\u305f. \u30de\u30ca\u30ad\u30f3\u30bf\u30a6\u30f3\u5730\u533a\u3067\u306f,\u30b8\u30a7\u30fc\u30e0\u30ba\u5ddd\u3092\u6e21\u308b\u30d2\u30e5\u30fc\u30b4\u30ce\u30c3\u30c8\u8a18\u5ff5\u6a4b\u3068\u30d2\u30e5\u30fc\u30b4\u30ce\u30c3\u30c8\u9053\u8def\u304c\u5f7c\u3089\u306e\u540d\u8a89\u306b\u547d\u540d\u3055\u308c,\u30d2\u30e5\u30fc\u30b4\u30ce\u30c3\u30c8\u9ad8\u6821\u3092\u542b\u3080\u3044\u304f\u3064\u304b\u306e\u5b66\u6821\u3092\u542b\u3080\u591a\u304f\u306e\u5730\u5143\u306e\u7279\u5fb4\u304c\u547d\u540d\u3055\u308c\u305f.",
"answer_correction": [],
"answer_correction-suggestion": null,
"answer_correction-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"answer_quality": [],
"answer_quality-suggestion": null,
"answer_quality-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"external_id": null,
"metadata": "{}",
"question": "242\u4eba\u306e\u30d2\u30e5\u30fc\u30b2\u30ce\u30c3\u30c8\u306f\u3082\u306f\u3084\u30de\u30ca\u30ad\u30f3\u30bf\u30a6\u30f3\u306b\u4f4f\u3093\u3067\u3044\u307e\u305b\u3093\u3067\u3057\u305f. \u3069\u3046\u306a\u3063\u305f\u306e? \u524d\u56de\u306e\u8cea\u554f\u3092\u8003\u3048\u308b\u3068,\u305d\u306e\u7b54\u3048\u3092\u542b\u3080\u6587\u8108\u3092\u66f8\u3044\u3066\u304f\u3060\u3055\u3044. 1\u304b\u308920\u6587\u306e\u6587\u7ae0\u3067\u3059 \u6587\u8108:"
}
```
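The `\uXXXX` sequences in the record above are standard JSON Unicode escapes; parsing with Python's `json` module restores the original Japanese text. A minimal sketch (the trimmed record below is illustrative, not taken verbatim from the dataset):

```python
import json

# JSON \uXXXX escapes decode back to the original characters on load.
raw = r'{"external_id": null, "metadata": "{}", "question": "\u8cea\u554f"}'
record = json.loads(raw)

print(record["question"])              # 質問 ("question" in Japanese)
print(record["external_id"])           # None: JSON null maps to Python None
print(json.loads(record["metadata"]))  # the metadata field is itself a JSON string
```

Note that `metadata` is stored as a JSON-encoded string (`"{}"`), so it needs a second `json.loads` to become a dictionary.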
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment, only text fields are supported. These are the fields used to provide responses to the questions.
* **question** is of type `text`.
* **answer** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **answer_quality** is of type `rating` with the following allowed values [1, 2, 3, 4, 5], and description "How would you rate the quality of the answer? (日本語訳: 回答の質をどのように評価しますか?)".
* (optional) **answer_correction** is of type `text`, and description "If you think the answer is not accurate, please, correct it. (日本語訳: 回答が正確でないと思う場合は、修正してください。)".
* **Suggestions:** As of Argilla 1.13.0, suggestions have been included to provide annotators with pre-filled responses that ease and speed up the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not only the suggested value itself but also any metadata linked to it, if applicable.
* (optional) **answer_quality-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
* (optional) **answer_correction-suggestion** is of type `text`.
Additionally, there are two more optional fields:
* **metadata:** An optional field that provides additional information about the dataset record. This can be useful to give annotators extra context, such as a link to the original source of the record or details like its author, date, or source. The metadata can optionally be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** An optional field that provides an external ID for the dataset record. This can be useful to link the record to an external resource, such as a database or a file.
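Because annotations are collected after publication, unannotated records carry empty lists in `answer_quality` and `answer_correction`, and `null` in the suggestion fields. A minimal sketch of filtering for records that already carry a rating (the sample records and the `{"value": ...}` response shape are illustrative assumptions, not the guaranteed export schema):

```python
# Hypothetical sample records mimicking the export shape shown above.
# Assumption: each submitted response is a dict with a "value" key.
records = [
    {"external_id": None, "answer_quality": [], "answer_correction": []},
    {"external_id": None, "answer_quality": [{"value": 4}], "answer_correction": []},
]

# An empty answer_quality list means the record has not been rated yet.
annotated = [r for r in records if r["answer_quality"]]
ratings = [resp["value"] for r in annotated for resp in r["answer_quality"]]
print(len(annotated), ratings)  # 1 [4]
```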
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
Please, read the question carefully and try to answer it as accurately as possible. (日本語訳: 質問をよく読んで、できるだけ正確に回答してください。)
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
growth-cadet/jobpost_signals-to-JSON03 | ---
dataset_info:
features:
- name: id
dtype: string
- name: ats
dtype: string
- name: context
dtype: string
- name: gpt-4_sys5_response
dtype: string
- name: gpt-4_sys5_cost
dtype: float64
splits:
- name: train
num_bytes: 13349123
num_examples: 2039
download_size: 6859646
dataset_size: 13349123
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_andysalerno__openchat-nectar-0.8 | ---
pretty_name: Evaluation run of andysalerno/openchat-nectar-0.8
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [andysalerno/openchat-nectar-0.8](https://huggingface.co/andysalerno/openchat-nectar-0.8)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__openchat-nectar-0.8\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T07:37:25.188045](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.8/blob/main/results_2024-01-21T07-37-25.188045.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6539948201078623,\n\
\ \"acc_stderr\": 0.03186024696025248,\n \"acc_norm\": 0.6547535995818915,\n\
\ \"acc_norm_stderr\": 0.03251445540703094,\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.01677659967672941,\n \"mc2\": 0.5226230452646764,\n\
\ \"mc2_stderr\": 0.015325117203952783\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407158,\n\
\ \"acc_norm\": 0.6578498293515358,\n \"acc_norm_stderr\": 0.013864152159177275\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6347341167098187,\n\
\ \"acc_stderr\": 0.00480520579872457,\n \"acc_norm\": 0.8305118502290381,\n\
\ \"acc_norm_stderr\": 0.0037441574425365596\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718872,\n \"\
acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718872\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033477,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033477\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646507,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646507\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829194,\n \
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829194\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944867,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944867\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n\
\ \"acc_stderr\": 0.013223928616741624,\n \"acc_norm\": 0.8365261813537676,\n\
\ \"acc_norm_stderr\": 0.013223928616741624\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n\
\ \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n\
\ \"acc_stderr\": 0.01448750085285042,\n \"acc_norm\": 0.25027932960893856,\n\
\ \"acc_norm_stderr\": 0.01448750085285042\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.023683591837008557,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.023683591837008557\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4921773142112125,\n\
\ \"acc_stderr\": 0.012768673076111898,\n \"acc_norm\": 0.4921773142112125,\n\
\ \"acc_norm_stderr\": 0.012768673076111898\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7279411764705882,\n \"acc_stderr\": 0.02703304115168146,\n\
\ \"acc_norm\": 0.7279411764705882,\n \"acc_norm_stderr\": 0.02703304115168146\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174937,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174937\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.01677659967672941,\n \"mc2\": 0.5226230452646764,\n\
\ \"mc2_stderr\": 0.015325117203952783\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8161010260457774,\n \"acc_stderr\": 0.01088791601330589\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6770280515542078,\n \
\ \"acc_stderr\": 0.012880360794851806\n }\n}\n```"
repo_url: https://huggingface.co/andysalerno/openchat-nectar-0.8
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|arc:challenge|25_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|gsm8k|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hellaswag|10_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T07-37-25.188045.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T07-37-25.188045.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- '**/details_harness|winogrande|5_2024-01-21T07-37-25.188045.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T07-37-25.188045.parquet'
- config_name: results
data_files:
- split: 2024_01_21T07_37_25.188045
path:
- results_2024-01-21T07-37-25.188045.parquet
- split: latest
path:
- results_2024-01-21T07-37-25.188045.parquet
---
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.8
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.8](https://huggingface.co/andysalerno/openchat-nectar-0.8) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.8",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-21T07:37:25.188045](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.8/blob/main/results_2024-01-21T07-37-25.188045.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6539948201078623,
"acc_stderr": 0.03186024696025248,
"acc_norm": 0.6547535995818915,
"acc_norm_stderr": 0.03251445540703094,
"mc1": 0.3574051407588739,
"mc1_stderr": 0.01677659967672941,
"mc2": 0.5226230452646764,
"mc2_stderr": 0.015325117203952783
},
"harness|arc:challenge|25": {
"acc": 0.6237201365187713,
"acc_stderr": 0.014157022555407158,
"acc_norm": 0.6578498293515358,
"acc_norm_stderr": 0.013864152159177275
},
"harness|hellaswag|10": {
"acc": 0.6347341167098187,
"acc_stderr": 0.00480520579872457,
"acc_norm": 0.8305118502290381,
"acc_norm_stderr": 0.0037441574425365596
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267438,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267438
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033477,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033477
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646507,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646507
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.02983796238829194,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.02983796238829194
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944867,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944867
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654585,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741624,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741624
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.01448750085285042,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.01448750085285042
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.023683591837008557,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.023683591837008557
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4921773142112125,
"acc_stderr": 0.012768673076111898,
"acc_norm": 0.4921773142112125,
"acc_norm_stderr": 0.012768673076111898
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7279411764705882,
"acc_stderr": 0.02703304115168146,
"acc_norm": 0.7279411764705882,
"acc_norm_stderr": 0.02703304115168146
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174937,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174937
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3574051407588739,
"mc1_stderr": 0.01677659967672941,
"mc2": 0.5226230452646764,
"mc2_stderr": 0.015325117203952783
},
"harness|winogrande|5": {
"acc": 0.8161010260457774,
"acc_stderr": 0.01088791601330589
},
"harness|gsm8k|5": {
"acc": 0.6770280515542078,
"acc_stderr": 0.012880360794851806
}
}
```
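Once loaded, a results payload shaped like the JSON above can be inspected programmatically. A minimal sketch (using only a small excerpt of the metrics reproduced here, not the full payload) that collects per-task accuracy and skips the aggregate `"all"` entry:

```python
import json

# Excerpt of a results payload in the same shape as the JSON above.
payload = json.loads("""
{
  "all": {"acc": 0.6539948201078623, "acc_norm": 0.6547535995818915},
  "harness|arc:challenge|25": {"acc": 0.6237201365187713, "acc_norm": 0.6578498293515358},
  "harness|hellaswag|10": {"acc": 0.6347341167098187, "acc_norm": 0.8305118502290381}
}
""")

# Collect accuracy per harness task, skipping the aggregate "all" entry.
per_task = {
    task: metrics["acc"]
    for task, metrics in payload.items()
    if task != "all" and "acc" in metrics
}

for task, acc in sorted(per_task.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task}: {acc:.4f}")
```

The same pattern applies to `acc_norm`, `mc1`/`mc2`, or any other metric key present for a given task.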
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
khalidalt/Arabic-Offensive_socialmedia | ---
dataset_info:
features:
- name: Id
dtype: int64
- name: Platform
dtype: string
- name: Comment
dtype: string
- name: Majority_Label
dtype: string
- name: Agreement
dtype: float64
- name: NumOfJudgementUsed
dtype: int64
- name: Total_Judgement
dtype: int64
- name: Vulgar:V/HateSpeech:HS/None:-
dtype: string
splits:
- name: train
num_bytes: 1184762
num_examples: 4000
download_size: 561174
dataset_size: 1184762
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- ar
--- |
johnearlesullivan/capitol | ---
task_categories:
- text-classification
- question-answering
- summarization
- text-generation
- feature-extraction
- translation
- text2text-generation
- token-classification
- conversational
language:
- en
pretty_name: capitol_case
size_categories:
- 10K<n<100K
--- |
autoevaluate/autoeval-staging-eval-project-29af5371-7254763 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- conll2003
eval_info:
task: entity_extraction
model: huggingface-course/bert-finetuned-ner
dataset_name: conll2003
dataset_config: conll2003
dataset_split: validation
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: huggingface-course/bert-finetuned-ner
* Dataset: conll2003
To run new evaluation jobs, visit Hugging Face's [automatic evaluation service](https://huggingface.co/spaces/autoevaluate/model-evaluator).
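The `col_mapping` entry in the header above tells the evaluator which dataset columns feed each task field (`tokens` → `tokens`, `tags` → `ner_tags`). Conceptually the remapping works like this hypothetical sketch (illustrative only, not the actual AutoTrain internals):

```python
# Evaluator field -> source dataset column, as in the card's col_mapping.
col_mapping = {"tokens": "tokens", "tags": "ner_tags"}

# One CoNLL-2003-style record (hypothetical values for illustration).
example = {"id": "0", "tokens": ["EU", "rejects", "German", "call"], "ner_tags": [3, 0, 7, 0]}

# Project the record onto the schema the token-classification evaluator expects,
# dropping columns (like "id") that are not mapped.
mapped = {field: example[column] for field, column in col_mapping.items()}
print(mapped)
```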
## Contributions
Thanks to [@douwekiela](https://huggingface.co/douwekiela) for evaluating this model. |
Deojoandco/capstone_fromgpt_without_gold_v4 | ---
dataset_info:
features:
- name: dialog_id
dtype: int64
- name: dialogue
dtype: string
- name: summary
dtype: string
- name: gold_tags
dtype: string
- name: gpt_success
dtype: bool
- name: gpt_response
dtype: string
- name: gold_tags_tokens_count
dtype: int64
- name: GPT_TAGS_FOUND
dtype: bool
- name: gpt_output_tags
dtype: string
- name: gpt_output_tag_tokens_count
dtype: int64
- name: GPT_MI_FOUND
dtype: bool
- name: gpt_tags_token_count
dtype: int64
- name: gpt_tags
dtype: string
- name: tag_token_count_match
dtype: bool
splits:
- name: test
num_bytes: 20862
num_examples: 12
download_size: 22567
dataset_size: 20862
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "capstone_fromgpt_without_gold_v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JonasWeinert/jus_sep | ---
language:
- en
pretty_name: jus--sep
--- |
open-llm-leaderboard/details_namirocks__tutor-model-13b-ep3 | ---
pretty_name: Evaluation run of namirocks/tutor-model-13b-ep3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [namirocks/tutor-model-13b-ep3](https://huggingface.co/namirocks/tutor-model-13b-ep3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_namirocks__tutor-model-13b-ep3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-23T14:22:10.523504](https://huggingface.co/datasets/open-llm-leaderboard/details_namirocks__tutor-model-13b-ep3/blob/main/results_2023-12-23T14-22-10.523504.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.565791735957054,\n\
\ \"acc_stderr\": 0.03351962241327427,\n \"acc_norm\": 0.5744059102733704,\n\
\ \"acc_norm_stderr\": 0.03427708379273797,\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5298607855616481,\n\
\ \"mc2_stderr\": 0.015266264009722644\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5298634812286689,\n \"acc_stderr\": 0.014585305840007105,\n\
\ \"acc_norm\": 0.5733788395904437,\n \"acc_norm_stderr\": 0.014453185592920293\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6179047998406691,\n\
\ \"acc_stderr\": 0.004849065962692133,\n \"acc_norm\": 0.8150766779525991,\n\
\ \"acc_norm_stderr\": 0.0038744190656586222\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686781,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686781\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3253968253968254,\n \"acc_stderr\": 0.024130158299762613,\n \"\
acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.024130158299762613\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.026593084516572274,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.026593084516572274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.027807032360686088,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.027807032360686088\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.02506909438729653,\n \
\ \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.02506909438729653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.03186608121408831,\n \
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.03186608121408831\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7522935779816514,\n \"acc_stderr\": 0.018508143602547832,\n \"\
acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.018508143602547832\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294635,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294635\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.0309645179269234,\n \"acc_norm\"\
: 0.7352941176470589,\n \"acc_norm_stderr\": 0.0309645179269234\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"\
acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n\
\ \"acc_stderr\": 0.01549108895149459,\n \"acc_norm\": 0.7496807151979565,\n\
\ \"acc_norm_stderr\": 0.01549108895149459\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.026189666966272035,\n\
\ \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.026189666966272035\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37206703910614525,\n\
\ \"acc_stderr\": 0.0161658475835633,\n \"acc_norm\": 0.37206703910614525,\n\
\ \"acc_norm_stderr\": 0.0161658475835633\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.02741799670563099,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.02741799670563099\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722324,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722324\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778855,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778855\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.424380704041721,\n\
\ \"acc_stderr\": 0.012623343757430017,\n \"acc_norm\": 0.424380704041721,\n\
\ \"acc_norm_stderr\": 0.012623343757430017\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.0302114796091216,\n\
\ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.0302114796091216\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5571895424836601,\n \"acc_stderr\": 0.02009508315457734,\n \
\ \"acc_norm\": 0.5571895424836601,\n \"acc_norm_stderr\": 0.02009508315457734\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387634,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387634\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.03036049015401466,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.03036049015401466\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5298607855616481,\n\
\ \"mc2_stderr\": 0.015266264009722644\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759987\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12054586808188021,\n \
\ \"acc_stderr\": 0.008968608285309085\n }\n}\n```"
repo_url: https://huggingface.co/namirocks/tutor-model-13b-ep3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|arc:challenge|25_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|gsm8k|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hellaswag|10_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T14-22-10.523504.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T14-22-10.523504.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- '**/details_harness|winogrande|5_2023-12-23T14-22-10.523504.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-23T14-22-10.523504.parquet'
- config_name: results
data_files:
- split: 2023_12_23T14_22_10.523504
path:
- results_2023-12-23T14-22-10.523504.parquet
- split: latest
path:
- results_2023-12-23T14-22-10.523504.parquet
---
# Dataset Card for Evaluation run of namirocks/tutor-model-13b-ep3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [namirocks/tutor-model-13b-ep3](https://huggingface.co/namirocks/tutor-model-13b-ep3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Each configuration exposes a timestamped split plus a "latest" split
# (see the data_files listing above); "latest" points to the newest run.
data = load_dataset("open-llm-leaderboard/details_namirocks__tutor-model-13b-ep3",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-23T14:22:10.523504](https://huggingface.co/datasets/open-llm-leaderboard/details_namirocks__tutor-model-13b-ep3/blob/main/results_2023-12-23T14-22-10.523504.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.565791735957054,
"acc_stderr": 0.03351962241327427,
"acc_norm": 0.5744059102733704,
"acc_norm_stderr": 0.03427708379273797,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5298607855616481,
"mc2_stderr": 0.015266264009722644
},
"harness|arc:challenge|25": {
"acc": 0.5298634812286689,
"acc_stderr": 0.014585305840007105,
"acc_norm": 0.5733788395904437,
"acc_norm_stderr": 0.014453185592920293
},
"harness|hellaswag|10": {
"acc": 0.6179047998406691,
"acc_stderr": 0.004849065962692133,
"acc_norm": 0.8150766779525991,
"acc_norm_stderr": 0.0038744190656586222
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686781,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686781
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.024130158299762613,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.024130158299762613
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572274,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.027807032360686088,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.027807032360686088
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.02506909438729653,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.02506909438729653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.03186608121408831,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.03186608121408831
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.018508143602547832,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.018508143602547832
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294635,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294635
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.0309645179269234,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.0309645179269234
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724147,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724147
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7496807151979565,
"acc_stderr": 0.01549108895149459,
"acc_norm": 0.7496807151979565,
"acc_norm_stderr": 0.01549108895149459
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.026189666966272035,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.026189666966272035
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37206703910614525,
"acc_stderr": 0.0161658475835633,
"acc_norm": 0.37206703910614525,
"acc_norm_stderr": 0.0161658475835633
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.02741799670563099,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.02741799670563099
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.026774929899722324,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.026774929899722324
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778855,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.424380704041721,
"acc_stderr": 0.012623343757430017,
"acc_norm": 0.424380704041721,
"acc_norm_stderr": 0.012623343757430017
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.0302114796091216,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.0302114796091216
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5571895424836601,
"acc_stderr": 0.02009508315457734,
"acc_norm": 0.5571895424836601,
"acc_norm_stderr": 0.02009508315457734
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387634,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387634
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.03036049015401466,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.03036049015401466
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5298607855616481,
"mc2_stderr": 0.015266264009722644
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759987
},
"harness|gsm8k|5": {
"acc": 0.12054586808188021,
"acc_stderr": 0.008968608285309085
}
}
```
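The per-task entries above can also be aggregated manually once loaded. The snippet below is a minimal sketch of computing an average accuracy across tasks; the three task scores are copied from the results above, and a real analysis would iterate over the full results dictionary rather than this hand-picked subset:

```python
# Sketch: mean accuracy over a hand-picked subset of the per-task results above.
# A full analysis would load the JSON results file and iterate over every
# "harness|hendrycksTest-*" key instead of this illustrative subset.
scores = {
    "harness|hendrycksTest-astronomy|5": 0.5789473684210527,
    "harness|hendrycksTest-marketing|5": 0.8547008547008547,
    "harness|hendrycksTest-virology|5": 0.4578313253012048,
}

mean_acc = sum(scores.values()) / len(scores)
print(f"mean acc over {len(scores)} tasks: {mean_acc:.4f}")
```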
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
qwedsacf/cortex | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1930496224
num_examples: 856586
download_size: 1005825499
dataset_size: 1930496224
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ShenaoZhang/0.001_idpo_dataset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: is_better
dtype: bool
splits:
- name: test_prefs_1
num_bytes: 13913254
num_examples: 2000
- name: train_prefs_1
num_bytes: 142551462
num_examples: 20378
- name: test_prefs_2
num_bytes: 14121124
num_examples: 2000
- name: train_prefs_2
num_bytes: 145719523
num_examples: 20378
download_size: 175825786
dataset_size: 316305363
configs:
- config_name: default
data_files:
- split: test_prefs_1
path: data/test_prefs_1-*
- split: train_prefs_1
path: data/train_prefs_1-*
- split: test_prefs_2
path: data/test_prefs_2-*
- split: train_prefs_2
path: data/train_prefs_2-*
---
# Dataset Card for "0.001_idpo_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adamo1139/toxic-dpo-natural-v2 | ---
license: other
license_name: other
license_link: LICENSE
---
|
maywell/SFT00 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 17462081
num_examples: 7218
download_size: 8602029
dataset_size: 17462081
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Harshithacj123/CCU_Midterm | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 41353
num_examples: 50
download_size: 23370
dataset_size: 41353
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "CCU_Midterm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FreedomIntelligence/alpaca-gpt4-japanese | ---
license: apache-2.0
---
The dataset is used in the research related to [MultilingualSIFT](https://github.com/FreedomIntelligence/MultilingualSIFT). |
Tippawan/test2-data-semi-p3 | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
- name: prob
sequence: float64
- name: ifpass
sequence: int64
- name: pred
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 12517124
num_examples: 2890
download_size: 1228320
dataset_size: 12517124
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
WillHeld/wmt19-valid-only-de_en | ---
dataset_info:
features:
- name: translation
dtype:
translation:
languages:
- de
- en
splits:
- name: validation
num_bytes: 757649
num_examples: 2998
download_size: 491141
dataset_size: 757649
---
# Dataset Card for "wmt19-valid-only-de_en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nos1de/qemu_linux_wireshark_ffmpeg_2020_vulnfuncs | ---
dataset_info:
features:
- name: function
dtype: string
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 89878238
num_examples: 20022
download_size: 23993098
dataset_size: 89878238
---
# Dataset Card for "qemu_linux_wireshark_ffmpeg_2020_vulnfuncs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shikii2/revii | ---
license: openrail
---
|
pixparse/pdfa-eng-wds |
---
license: other
license_name: pdfa-eng-wds
license_link: LICENSE
task_categories:
- image-to-text
size_categories:
- 10M<n<100M
language:
- en
splits:
- name: train
num_examples: 2159432
---
# Dataset Card for PDF Association dataset (PDFA)
## Dataset Description
- **Point of Contact from curators:** [Peter Wyatt, PDF Association CTO](mailto:peter.wyatt@pdfa.org)
- **Point of Contact Hugging Face:** [Pablo Montalvo](mailto:pablo@huggingface.co)
### Dataset Summary
The PDFA dataset is a document dataset filtered from the SafeDocs corpus, aka CC-MAIN-2021-31-PDF-UNTRUNCATED. The original purpose of that corpus is comprehensive analysis of PDF documents. This subset serves a different purpose: the focus here is on making the dataset machine-learning-ready for vision-language models.
<center>
<img src="https://huggingface.co/datasets/pixparse/pdfa-eng-wds/resolve/main/doc_images/Nexsen_pruet.png" alt="A brochure with visible bounding boxes for lines and words" width="600" height="300">
<p><em>An example page of one pdf document, with added bounding boxes around words (red), lines (blue) and embedded images (green). </em></p>
</center>
This instance of PDFA is in [webdataset](https://github.com/webdataset/webdataset/) .tar format and can be used with derived forms of the `webdataset` library.
### Usage with `chug`
Check out [chug](https://github.com/huggingface/chug), our optimized library for sharded dataset loading!
```python
import chug
task_cfg = chug.DataTaskDocReadCfg(
page_sampling='all',
)
data_cfg = chug.DataCfg(
source='pixparse/pdfa-eng-wds',
split='train',
batch_size=None,
format='hfids',
num_workers=0,
)
data_loader = chug.create_loader(
data_cfg,
task_cfg,
)
sample = next(iter(data_loader))
```
### Usage with `datasets`
This dataset can also be used with the `webdataset` library or current releases of the Hugging Face `datasets` library. Here is an example using the "streaming" parameter; we do recommend downloading the dataset to save bandwidth.
```python
dataset = load_dataset('pixparse/pdfa-eng-wds', streaming=True)
print(next(iter(dataset['train'])).keys())
>> dict_keys(['__key__', '__url__', 'json', 'ocr', 'pdf', 'tif'])
```
For faster download, you can use the `huggingface_hub` library directly. Make sure `hf_transfer` is installed prior to downloading, and ensure you have enough local disk space.
```python
import os
os.environ["HF_HUB_ENABLE_HF_TRANSFER"] = "1"
from huggingface_hub import HfApi, logging
#logging.set_verbosity_debug()
hf = HfApi()
hf.snapshot_download("pixparse/pdfa-eng-wds", repo_type="dataset", local_dir_use_symlinks=False)
```
On a typical setup, the 1.5 TB can be downloaded in approximately 4 hours.
Additionally, a metadata file `_pdfa-english-train-info-minimal.json` contains the list of samples per shard, with the same basename and `.json` or `.pdf` extension, as well as the count of files per shard.
#### Words and lines document metadata
Initially, we started from the readily available ~11TB zip files from PDFA in their initial [data release](https://digitalcorpora.org/corpora/file-corpora/cc-main-2021-31-pdf-untruncated/).
From the pdf digital files, we extracted words, bounding boxes and image bounding boxes that are available in the pdf file. This information is then reshaped into lines organized in reading order, under the key `lines`. We keep non-reshaped word and bounding box information under the `words` key, should users want to apply their own heuristic.
The way we obtain an approximate reading order is simply by looking at the frequency peaks of the leftmost word x-coordinate. A frequency peak means that a high number of lines are starting from the same point. Then, we keep track of the x-coordinate of each such identified column. If no peaks are found, the document is assumed to be readable in plain format.
The code used to detect columns is shown below.
```python
import numpy as np
import scipy.ndimage
import scipy.signal

def get_columnar_separators(page, min_prominence=0.3, num_bins=10, kernel_width=1):
"""
Identifies the x-coordinates that best separate columns by analyzing the derivative of a histogram
of the 'left' values (xmin) of bounding boxes.
Args:
page (dict): Page data with 'bbox' containing bounding boxes of words.
min_prominence (float): The required prominence of peaks in the histogram.
num_bins (int): Number of bins to use for the histogram.
kernel_width (int): The width of the Gaussian kernel used for smoothing the histogram.
Returns:
separators (list): The x-coordinates that separate the columns, if any.
"""
try:
left_values = [b[0] for b in page['bbox']]
hist, bin_edges = np.histogram(left_values, bins=num_bins)
hist = scipy.ndimage.gaussian_filter1d(hist, kernel_width)
min_val = min(hist)
hist = np.insert(hist, [0, len(hist)], min_val)
bin_width = bin_edges[1] - bin_edges[0]
bin_edges = np.insert(bin_edges, [0, len(bin_edges)], [bin_edges[0] - bin_width, bin_edges[-1] + bin_width])
peaks, _ = scipy.signal.find_peaks(hist, prominence=min_prominence * np.max(hist))
derivatives = np.diff(hist)
separators = []
if len(peaks) > 1:
# This finds the index of the maximum derivative value between peaks
# which indicates peaks after trough --> column
for i in range(len(peaks)-1):
peak_left = peaks[i]
peak_right = peaks[i+1]
max_deriv_index = np.argmax(derivatives[peak_left:peak_right]) + peak_left
separator_x = bin_edges[max_deriv_index + 1]
separators.append(separator_x)
except Exception as e:
separators = []
return separators
```
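As a quick sanity check, the function can be run on synthetic two-column data (the function and its `numpy`/`scipy` imports are reproduced here so the snippet is self-contained). The exact separator position depends on binning, but one separator should land between the two columns:

```python
import numpy as np
import scipy.ndimage
import scipy.signal

def get_columnar_separators(page, min_prominence=0.3, num_bins=10, kernel_width=1):
    """Identify x-coordinates separating columns (same logic as above)."""
    try:
        left_values = [b[0] for b in page['bbox']]
        hist, bin_edges = np.histogram(left_values, bins=num_bins)
        hist = scipy.ndimage.gaussian_filter1d(hist, kernel_width)
        min_val = min(hist)
        hist = np.insert(hist, [0, len(hist)], min_val)
        bin_width = bin_edges[1] - bin_edges[0]
        bin_edges = np.insert(bin_edges, [0, len(bin_edges)],
                              [bin_edges[0] - bin_width, bin_edges[-1] + bin_width])
        peaks, _ = scipy.signal.find_peaks(hist, prominence=min_prominence * np.max(hist))
        derivatives = np.diff(hist)
        separators = []
        if len(peaks) > 1:
            # Maximum derivative between two peaks marks the start of the next column
            for i in range(len(peaks) - 1):
                max_deriv_index = np.argmax(derivatives[peaks[i]:peaks[i + 1]]) + peaks[i]
                separators.append(bin_edges[max_deriv_index + 1])
    except Exception:
        separators = []
    return separators

# Synthetic page: 40 word boxes starting near x=0.1 and 40 near x=0.6 (two columns)
page = {'bbox': [[0.1, 0.0, 0.05, 0.01]] * 40 + [[0.6, 0.0, 0.05, 0.01]] * 40}
separators = get_columnar_separators(page)
print(separators)  # a single x-coordinate between the two columns
```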
<center>
<img src="https://huggingface.co/datasets/pixparse/pdfa-eng-wds/resolve/main/doc_images/columnar_detection.png" alt="A graph of leftmost x positions in a 2-columns document" width="600" height="300">
<p><em>A graph of leftmost x-positions of bounding boxes on a 2-column (arxiv) document. Peaks are visibly detected. </em></p>
</center>
For each pdf document, we store statistics on the file size, number of words (as characters separated by spaces), number of pages, as well as the rendering times of each page for a given dpi.
#### Filtering process
File size and page rendering time are used to set thresholds in the final dataset: the goal is to remove files that are larger than 100 MB, or that take more than 500ms to render on a modern machine, to optimize dataloading at scale. Having "too large" or "too slow" files would add a burden to large-scale training pipelines and we choose to alleviate this in the current release. Finally, a full pass over the dataset is done, trying to open and decode a bytestream from each raw object and discarding any object (pair pdf/json) that fails to be opened, to remove corrupted data.
As a last step, we use XLM-RoBERTa, specifically `papluca/xlm-roberta-base-language-detection`, on the first 512 words of the first page of each document to restrict the dataset to an English subset.
Be aware that some documents may have several languages embedded in them, or that some predictions might be inaccurate. A majority of documents from the original corpus are in English.
<center>
<img src="https://huggingface.co/datasets/pixparse/pdfa-english-train/resolve/main/doc_images/languages_pdfa_xlmroberta.png" alt="A histogram of languages count in the PDFA dataset." width="600" height="300">
<p><em>A histogram of language distribution taken on a fraction of the original -non-filtered on language- PDFA dataset. </em></p>
</center>
At the end, each document exists as a pairing of a pdf and a json file containing extensive OCR annotation as well as metadata information about rendering times. The filtering and packaging in webdataset format are tailored towards multimodal machine learning at scale, specifically image-to-text tasks.
### Data, metadata and statistics.
PDF files come from various sources. They are in RGB format, contain multiple pages, and can be rendered using the engine of your choice, here [pdf2image](https://github.com/Belval/pdf2image).
```python
from pdf2image import convert_from_bytes
pdf_first_page = convert_from_bytes(sample['pdf'], dpi=300, first_page=1, last_page=1)[0]
```
<center>
<img src="https://huggingface.co/datasets/pixparse/pdfa-eng-wds/resolve/main/doc_images/pdf_first_page.png" alt="Rendering of an image for a Grade 8 lesson plan" width="400" height="600">
</center>
The metadata for each document has been formatted in this way. Each `pdf` is paired with a `json` file with the following structure. Entries have been shortened for readability.
```json
{
"pages": [
{
"words": [
{
"text": [
"Health", "Smart", "Virginia", "Sample", "Lesson", "Plan", "Grade", "8", "-", "HP-7"
],
"bbox": [
[0.117647, 0.045563, 0.051981, 0.015573],
[0.174694, 0.045563, 0.047954, 0.015573],
[0.227643, 0.045563, 0.05983, 0.015573],
[0.292539, 0.045563, 0.061002, 0.015573],
[0.357839, 0.045563, 0.058053, 0.015573],
[0.420399, 0.045563, 0.035908, 0.015573],
[0.716544, 0.04577, 0.054624, 0.016927],
[0.776681, 0.04577, 0.010905, 0.016927],
[0.793087, 0.04577, 0.00653, 0.016927],
[0.805078, 0.04577, 0.044768, 0.016927]
],
"score": [
1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0
],
"line_pos": [
[0, 0], [0, 8], [0, 16], [0, 24], [0, 32], [0, 40], [0, 48], [1, 0], [2, 0], [3, 0]
]
}
],
"lines": [
{
"text": [
"Health Smart Virginia Sample Lesson Plan Grade", "Physical", "Disease", "Health", "2020", "Grade 8 Sample Lesson Plan:"
],
"bbox": [
[0.117647, 0.045563, 0.653521, 0.016927],
[0.716546, 0.063952, 0.07323199999999996, 0.016927],
[0.716546, 0.082134, 0.07102200000000003, 0.016927],
[0.716546, 0.100315, 0.05683300000000002, 0.016927],
[0.716546, 0.118497, 0.043709, 0.016927],
[0.27, 0.201185, 0.459554, 0.028268]
],
"score": [
1.0, 1.0, 1.0, 1.0, 1.0, 1.0
],
"word_slice": [
[0, 7], [7, 8], [8, 9], [9, 10], [10, 11], [11, 16]
]
}
],
"images_bbox": [
[0.37353, 0.090907, 0.253736, 0.100189]
],
"images_bbox_no_text_overlap": [
[0.37353, 0.090907, 0.253736, 0.100189]
]
}
]
}
```
The top-level key, `pages`, is a list of every page in the document. The above example shows only one page.
`words` is a list of words without spaces, with their individual associated bounding box in the next entry.
`bbox` contains the bounding box coordinates in `left, top, width, height` format, with coordinates relative to the page size.
`line_pos`, for words, is a list of tuples indicating the index of the line the word belongs to, then the starting position in that line, character-wise.
`lines` are sequences of words (strings separated by spaces) grouped into lines in reading order, using the heuristic detailed above. `word_slice` gives, for each line, the `[start, end)` span of indices into the page's word list.
`bbox` contains the bounding box coordinates in `left, top, width, height` format, with coordinates relative to the page size.
For each page,
`images_bbox` gives the bounding boxes of the images embedded in the page.
`images_bbox_no_text_overlap` gives a reduced list of bounding boxes that have no overlap with text found in the pdf. Text might be present as a drawing or another representation, however.
`score` is a placeholder of value 1.0 for the entire dataset.
This formatting follows that of the multimodal dataset from the Industry Documents Library, https://huggingface.co/datasets/pixparse/idl-wds.
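As a small worked example of the fields above (the page pixel size and helper names are hypothetical, chosen for illustration), the relative `left, top, width, height` boxes can be converted to absolute pixel coordinates, and a line's text can be rebuilt from `words` using `word_slice`:

```python
def bbox_rel_to_abs(bbox, page_width, page_height):
    """Convert a relative (left, top, width, height) bbox to absolute pixels."""
    left, top, width, height = bbox
    return (left * page_width, top * page_height,
            width * page_width, height * page_height)

def lines_from_words(words, word_slices):
    """Rebuild each line's text by joining the word span given by word_slice."""
    return [" ".join(words[start:end]) for start, end in word_slices]

# Values mirroring the metadata example above
words = ["Health", "Smart", "Virginia", "Sample", "Lesson", "Plan", "Grade", "8", "-", "HP-7"]
print(lines_from_words(words, [[0, 7]]))
# → ['Health Smart Virginia Sample Lesson Plan Grade']

# Assuming a US Letter page rendered at 300 dpi (about 2550x3300 pixels)
print(bbox_rel_to_abs([0.117647, 0.045563, 0.051981, 0.015573], 2550, 3300))
```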
Estimating the number of tokens is done using a `LlamaTokenizer` from `tokenizers`. There is a clear power law distribution with respect to data length.
<center>
<img src="https://huggingface.co/datasets/pixparse/pdfa-eng-wds/resolve/main/doc_images/token_count_distribution.png" alt="A histogram of token count distribution per page" width="600" height="300">
<p><em>A histogram of token count distribution per page, taken from a subset of the dataset. There is a visible power law. </em></p>
</center>
### Data Splits
#### Train
* `pdfa-eng-wds-{0000..1799}.tar`
* Downloaded on 2024/01/22
* 1800 shards (approx 1200 docs/shard)
* 2,159,432 samples
* 18M pages
* 9.7 billion tokens (around 5 billion words)
## Additional Information
### Dataset Curators
Pablo Montalvo, Ross Wightman
### Disclaimer and note to researchers
This dataset is intended as an OCR-heavy pretraining basis for vision-language models. As a corpus, it does not represent the intent and purpose of CC-MAIN-2021-31-PDF-UNTRUNCATED. The original is meant to represent extant PDF data in its diversity and complexity. In particular, common issues related to misuse of PDFs, such as mojibake (garbled text due to decoding errors), are yet to be addressed systematically, and this dataset presents simplifications that can hide such issues found in the wild. To address these biases, we recommend examining carefully both the simplified annotation and the original `pdf` data, beyond a simple rendering.
Further, the annotation is limited to what can be extracted and is readily available - text drawn in images and only present as a bitmap rendition might be missed entirely by said annotation.
Finally, the restriction to English is made to alleviate difficulties related to multilingual processing, so that the community can become familiar with this optimized multimodal format. A later release will cover the full PDFA, with splits per language, layout type, and so on.
### Licensing Information
Data has been filtered from the original corpus. As a consequence, users should note [Common Crawl's license and terms of use](https://commoncrawl.org/terms-of-use) and the [Digital Corpora project's Terms of Use](https://digitalcorpora.org/about-digitalcorpora/terms-of-use/).
|
CyberHarem/elise_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of elise/エリーゼ (Fire Emblem)
This is the dataset of elise/エリーゼ (Fire Emblem), containing 335 images and their tags.
The core tags of this character are `blonde_hair, long_hair, twintails, purple_eyes, bow, hair_bow, ribbon, drill_hair, purple_hair, hair_ribbon, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 335 | 313.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elise_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 335 | 213.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elise_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 676 | 400.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elise_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 335 | 288.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elise_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 676 | 505.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elise_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/elise_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 45 |  |  |  |  |  | 1girl, solo, gloves, smile, dress, open_mouth, staff, thighhighs, armor, thigh_boots |
| 1 | 11 |  |  |  |  |  | 1girl, black_bow, open_mouth, bangs, white_background, white_rose, simple_background, solo, :d, black_dress, blush, looking_at_viewer, very_long_hair, black_gloves, earrings, two-tone_hair, upper_body, black_capelet, holding_staff, long_sleeves, pink_bowtie |
| 2 | 5 |  |  |  |  |  | 1girl, ninja, official_alternate_costume, open_mouth, red_scarf, bangs, looking_at_viewer, obi, :d, bare_shoulders, black_gloves, blush, fingerless_gloves, multicolored_hair, solo, 2girls, earrings, holding, scroll, shuriken, sidelocks, simple_background, sleeveless_kimono, upper_body, very_long_hair |
| 3 | 7 |  |  |  |  |  | 1girl, navel, solo, blush, nipples, medium_breasts, open_mouth, completely_nude, looking_at_viewer, one_eye_closed, smile |
| 4 | 8 |  |  |  |  |  | 1girl, hetero, nipples, solo_focus, 1boy, completely_nude, open_mouth, pussy, sex, navel, penis, blush, girl_on_top, small_breasts, smile, spread_legs, vaginal, cowgirl_position, medium_breasts, uncensored |
| 5 | 6 |  |  |  |  |  | 1boy, 1girl, hetero, sex, solo_focus, vaginal, cum_in_pussy, gloves, nipples, penis, open_mouth, small_breasts, thighhighs, blush, mosaic_censoring |
| 6 | 7 |  |  |  |  |  | open_mouth, smile, black_bow, blush, multicolored_hair, flower_necklace, small_breasts, very_long_hair, 1girl, 2girls, bikini, black_one-piece_swimsuit, blue_sky, casual_one-piece_swimsuit, closed_eyes, cloud, day, outdoors, water |
| 7 | 6 |  |  |  |  |  | 1girl, bondage, solo, arms_behind_back, gagged, rope, shibari, improvised_gag, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | gloves | smile | dress | open_mouth | staff | thighhighs | armor | thigh_boots | black_bow | bangs | white_background | white_rose | simple_background | :d | black_dress | blush | looking_at_viewer | very_long_hair | black_gloves | earrings | two-tone_hair | upper_body | black_capelet | holding_staff | long_sleeves | pink_bowtie | ninja | official_alternate_costume | red_scarf | obi | bare_shoulders | fingerless_gloves | multicolored_hair | 2girls | holding | scroll | shuriken | sidelocks | sleeveless_kimono | navel | nipples | medium_breasts | completely_nude | one_eye_closed | hetero | solo_focus | 1boy | pussy | sex | penis | girl_on_top | small_breasts | spread_legs | vaginal | cowgirl_position | uncensored | cum_in_pussy | mosaic_censoring | flower_necklace | bikini | black_one-piece_swimsuit | blue_sky | casual_one-piece_swimsuit | closed_eyes | cloud | day | outdoors | water | bondage | arms_behind_back | gagged | rope | shibari | improvised_gag |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------|:--------|:--------|:-------------|:--------|:-------------|:--------|:--------------|:------------|:--------|:-------------------|:-------------|:--------------------|:-----|:--------------|:--------|:--------------------|:-----------------|:---------------|:-----------|:----------------|:-------------|:----------------|:----------------|:---------------|:--------------|:--------|:-----------------------------|:------------|:------|:-----------------|:--------------------|:--------------------|:---------|:----------|:---------|:-----------|:------------|:--------------------|:--------|:----------|:-----------------|:------------------|:-----------------|:---------|:-------------|:-------|:--------|:------|:--------|:--------------|:----------------|:--------------|:----------|:-------------------|:-------------|:---------------|:-------------------|:------------------|:---------|:---------------------------|:-----------|:----------------------------|:--------------|:--------|:------|:-----------|:--------|:----------|:-------------------|:---------|:-------|:----------|:-----------------|
| 0 | 45 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | | | X | | | | | | X | | | X | X | | X | X | X | X | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | | X | | X | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | | | X | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | X | | | X | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | | X | X | | X | | X | | | X | X | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | | | X | | X | | | | | X | | | | | | | X | | X | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
|
stephonX/test | ---
license: mit
---
|
chanelcolgate/tumorsbrain | ---
dataset_info:
features:
- name: image
dtype: image
- name: image_id
dtype: int64
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: float64
- name: bbox
sequence: float32
length: 4
- name: label
dtype:
class_label:
names:
'0': negative
'1': positive
- name: iscrowd
dtype: bool
splits:
- name: train
num_bytes: 11482275
num_examples: 893
- name: test
num_bytes: 2794404
num_examples: 223
download_size: 12628405
dataset_size: 14276679
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
aisuko/simple_english_wikipedia_p0 | ---
license: apache-2.0
language:
- en
---
For research use only.
## The conversion process
```python
# Set up the environment
import os

os.environ['DATASET_URL'] = 'http://sbert.net/datasets/simplewiki-2020-11-01.jsonl.gz'
os.environ['DATASET_NAME'] = 'simplewiki-2020-11-01.jsonl.gz'
os.environ['MODEL_NAME'] = 'multi-qa-MiniLM-L6-cos-v1'

# Load the dataset
import json
import gzip
from sentence_transformers.util import http_get

http_get(os.getenv('DATASET_URL'), os.getenv('DATASET_NAME'))

passages = []
with gzip.open(os.getenv('DATASET_NAME'), 'rt', encoding='utf8') as fIn:
    for line in fIn:
        data = json.loads(line.strip())
        # Only the first paragraph of each article is kept;
        # use passages.extend(data['paragraphs']) to keep them all.
        passages.append(data['paragraphs'][0])
len(passages)

# Load the bi-encoder model
from sentence_transformers import SentenceTransformer

bi_encoder = SentenceTransformer(os.getenv('MODEL_NAME'))
bi_encoder.max_seq_length = 256
bi_encoder.to('cuda')

# Normalize the embeddings
from sentence_transformers.util import normalize_embeddings

corpus_embeddings = bi_encoder.encode(passages, convert_to_tensor=True, show_progress_bar=True).to('cuda')
corpus_embeddings = normalize_embeddings(corpus_embeddings)
len(corpus_embeddings)

# Save the embeddings to a CSV file
import pandas as pd

embeddings_data = pd.DataFrame(corpus_embeddings.cpu())
embeddings_data.to_csv('simple_english_wikipedia.csv', index=False)
``` |
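The saved embeddings can then be used for semantic search. A minimal standalone sketch (random vectors stand in for the real corpus so it runs without the model or the data):

```python
import numpy as np

# Random unit vectors stand in for the embeddings saved to
# simple_english_wikipedia.csv above; 384 matches the MiniLM dimension.
rng = np.random.default_rng(0)
corpus = rng.normal(size=(100, 384)).astype(np.float32)
corpus /= np.linalg.norm(corpus, axis=1, keepdims=True)   # unit-normalize rows

def search(query_vec, corpus, top_k=5):
    """Return indices of the top_k most cosine-similar corpus rows."""
    q = query_vec / np.linalg.norm(query_vec)
    scores = corpus @ q               # dot product == cosine for unit rows
    return np.argsort(-scores)[:top_k]

hits = search(corpus[3], corpus)      # a row is most similar to itself
```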
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test-mathemakitt-c50da3-1597456329 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test
eval_info:
task: text_zero_shot_classification
model: facebook/opt-125m
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test
dataset_config: mathemakitten--winobias_antistereotype_test
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-125m
* Dataset: mathemakitten/winobias_antistereotype_test
* Config: mathemakitten--winobias_antistereotype_test
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
hon9kon9ize/yue_school_math_0.25M | ---
license: gpl-3.0
---
# Cantonese School Math 0.25M
This dataset is a Cantonese translation of the Simplified Chinese dataset [BelleGroup/school_math_0.25M](https://huggingface.co/datasets/BelleGroup/school_math_0.25M); please see the original dataset for more information.
It was translated with [indiejoseph/bart-translation-zh-yue](https://huggingface.co/indiejoseph/bart-translation-zh-yue) and has not undergone any manual verification. The content may be inaccurate or misleading, so please keep this in mind when using this dataset.
## Sample
```
{
"instruction": "題目:「「小華手入面有一個裝滿糖嘅袋,一共有12個,小明想知入面有幾粒糖,於是佢問小華:「你手入面嘅糖袋入面有幾粒糖?」」小華答:「有,而且多過10粒。」」請問小華手入面嘅糖袋入面最少有幾粒糖?",
"input": "",
"output": "由題目可知:小華手入面嘅糖袋入面有偶數個糖;\n又知道小華手入面嘅糖袋入面有多過10粒糖。\n因為糖分係偶數,多10粒,所以糖分最細一定係12。\n所以小華手入邊嘅糖袋最少有12粒糖。"
}
```
## Licensing Information
This dataset is provided under the same license as the original dataset: gpl-3.0
## Limitation and Usage Limits
Please check the original dataset for more information. |
gauss314/arg-equity | ---
license: apache-2.0
task_categories:
- tabular-classification
- tabular-regression
tags:
- Merval
- equity
pretty_name: Merval daily variations, for deep learning and machine learning tests
---
# Downloading the Merval Equity Dataset
This document will guide you through the steps to download the Merval equity dataset from Hugging Face Datasets.
To start, you'll need to install Hugging Face's `datasets` library if you haven't done so already.
You can do this using the following pip command:
```python
!pip install datasets
```
Here's the Python code to load the Merval equity dataset from Hugging Face Datasets and convert it into a pandas DataFrame:
```python
from datasets import load_dataset
import pandas as pd
id = "gauss314/arg-equity"
data = load_dataset(id)
df = pd.DataFrame(data['train'][:])
```
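Since the card advertises the data for machine-learning tests, a simple random train/test split of the loaded DataFrame may be a useful next step. The sketch below uses toy data, because the real column names are not listed on this card:

```python
import pandas as pd

# Toy frame standing in for `df` loaded above (the real column names
# are not listed on this card), split 80/20 for quick ML experiments.
df = pd.DataFrame({"var_a": [0.1, -0.2, 0.3, 0.0, 0.5, -0.1],
                   "var_b": [1.0, 0.9, 1.1, 1.2, 0.8, 1.0]})
train = df.sample(frac=0.8, random_state=42)   # 80% of rows, reproducible
test = df.drop(train.index)                    # the remaining rows
```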
|
HuggingFaceM4/debug_MMMU_open_ended_to_remove | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: images
sequence: image
- name: question_type
dtype: string
- name: explanation
dtype: string
- name: topic_difficulty
dtype: string
- name: subfield
dtype: string
- name: img_type
dtype: string
splits:
- name: dev
num_bytes: 3457987.14
num_examples: 9
- name: validation
num_bytes: 20254614.737777777
num_examples: 53
- name: test
num_bytes: 192872094.82857144
num_examples: 627
download_size: 48031369
dataset_size: 216584696.70634922
---
# Dataset Card for "debug_MMMU_open_ended_to_remove"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ineoApp/factures_dataset_99 | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': numero facture
'2': fournisseur
'3': date facture
'4': date limite
'5': montant ht
'6': montant ttc
'7': tva
'8': prix tva
'9': addresse
'10': reference
'11': art1 designation
'12': art1 quantite
'13': art1 prix unit
'14': art1 tva
'15': art1 montant ht
'16': art2 designation
'17': art2 quantite
'18': art2 prix unit
'19': art2 tva
'20': art2 montant ht
'21': art3 designation
'22': art3 quantite
'23': art3 prix unit
'24': art3 tva
'25': art3 montant ht
'26': art4 designation
'27': art4 quantite
'28': art4 prix unit
'29': art4 tva
'30': art4 montant ht
'31': art5 designation
'32': art5 quantite
'33': art5 prix unit
'34': art5 tva
'35': art5 montant ht
'36': art6 designation
'37': art6 quantite
'38': art6 prix unit
'39': art6 tva
'40': art6 montant ht
'41': art7 designation
'42': art7 quantite
'43': art7 prix unit
'44': art7 tva
'45': art7 montant ht
'46': art8 designation
'47': art8 quantite
'48': art8 prix unit
'49': art8 tva
'50': art8 montant ht
'51': art9 designation
'52': art9 quantite
'53': art9 prix unit
'54': art9 tva
'55': art9 montant ht
'56': art10 designation
'57': art10 quantite
'58': art10 prix unit
'59': art10 tva
'60': art10 montant ht
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 29250169.548387095
num_examples: 24
- name: test
num_bytes: 8531299.451612903
num_examples: 7
download_size: 30317265
dataset_size: 37781469.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
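The `ner_tags` label list in the header above follows a regular pattern: the `O` tag, ten invoice-level fields, then five per-line-item fields repeated for articles 1 through 10. A hypothetical helper (not part of the dataset) that regenerates the same 61 names:

```python
# Hypothetical helper (not part of the dataset): regenerate the
# 61-entry ner_tags label list from its regular pattern.
invoice_fields = ["numero facture", "fournisseur", "date facture",
                  "date limite", "montant ht", "montant ttc", "tva",
                  "prix tva", "addresse", "reference"]
item_fields = ["designation", "quantite", "prix unit", "tva", "montant ht"]
labels = ["O"] + invoice_fields + [
    f"art{i} {f}" for i in range(1, 11) for f in item_fields
]
```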
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-75000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 992412
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AmelieSchreiber/ptm_500K | ---
license: mit
---
# Post Translational Modification 500K Dataset
This dataset was created from UniProt using
[this notebook](https://huggingface.co/datasets/AmelieSchreiber/ptm_500K/blob/main/ptm_data_preprocessing.ipynb). |
sameeksha2705/vit-fire-detection | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Fire
'1': Normal
'2': Smoke
splits:
- name: train
num_bytes: 155708396.35
num_examples: 7575
download_size: 891961525
dataset_size: 155708396.35
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
skar02/codegen | ---
dataset_info:
features:
- name: story
dtype: string
- name: code
dtype: string
splits:
- name: train
num_bytes: 6711
num_examples: 5
download_size: 12099
dataset_size: 6711
---
# Dataset Card for "codegen"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ihaflix1/vozourograve | ---
license: openrail
---
|
zirui3/cuad_instructions | ---
license: cc-by-4.0
---
|
haxxa/seo_urls | ---
license: llama2
language:
- en
--- |
cmcmaster/OpenHermes-2.5-Rheumatology | ---
dataset_info:
features:
- name: language
dtype: string
- name: topic
dtype: string
- name: hash
sequence: int64
- name: model
dtype: string
- name: source
dtype: string
- name: category
dtype: string
- name: avatarUrl
dtype: string
- name: model_name
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
- name: id
dtype: string
- name: title
dtype: string
- name: custom_instruction
dtype: bool
- name: system_prompt
dtype: string
- name: skip_prompt_formatting
dtype: bool
- name: views
dtype: int64
- name: idx
dtype: string
- name: text
dtype: string
- name: rheumatology_terms
dtype: bool
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 24801423
num_examples: 3058
download_size: 10071959
dataset_size: 24801423
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "OpenHermes-2.5-Rheumatology"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
guoyu-zhang/hh_llama2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 179680583
num_examples: 124503
download_size: 106850234
dataset_size: 179680583
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Starkan/OCRT | ---
dataset_info:
features:
- name: image_input
sequence:
sequence:
sequence: float32
- name: text_input
sequence: int64
- name: _parse
struct:
- name: num_pages
dtype: int64
- name: page_indices
sequence: int64
- name: page_text
sequence:
sequence: string
- name: text_target
sequence: int64
splits:
- name: train
num_bytes: 19008551.25
num_examples: 6
- name: test
num_bytes: 6336183.75
num_examples: 2
download_size: 907924
dataset_size: 25344735.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
tasksource/lexcomp-nc-relation | ---
license: apache-2.0
language:
- en
---
https://github.com/vered1986/lexcomp/tree/master
```
@article{shwartz-dagan-2019-still,
title = "Still a Pain in the Neck: Evaluating Text Representations on Lexical Composition",
author = "Shwartz, Vered and
Dagan, Ido",
journal = "Transactions of the Association for Computational Linguistics",
volume = "7",
year = "2019",
address = "Cambridge, MA",
publisher = "MIT Press",
url = "https://aclanthology.org/Q19-1027",
doi = "10.1162/tacl_a_00277",
pages = "403--419",
abstract = "Building meaningful phrase representations is challenging because phrase meanings are not simply the sum of their constituent meanings. Lexical composition can shift the meanings of the constituent words and introduce implicit information. We tested a broad range of textual representations for their capacity to address these issues. We found that, as expected, contextualized word representations perform better than static word embeddings, more so on detecting meaning shift than in recovering implicit information, in which their performance is still far from that of humans. Our evaluation suite, consisting of six tasks related to lexical composition effects, can serve future research aiming to improve representations.",
}
``` |
open-llm-leaderboard/details_giraffe176__Open_Neural_Monarch_Maidv0.1 | ---
pretty_name: Evaluation run of giraffe176/Open_Neural_Monarch_Maidv0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [giraffe176/Open_Neural_Monarch_Maidv0.1](https://huggingface.co/giraffe176/Open_Neural_Monarch_Maidv0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_giraffe176__Open_Neural_Monarch_Maidv0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T03:24:19.468676](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__Open_Neural_Monarch_Maidv0.1/blob/main/results_2024-03-01T03-24-19.468676.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6522307491293496,\n\
\ \"acc_stderr\": 0.03204488687562026,\n \"acc_norm\": 0.6541663775575516,\n\
\ \"acc_norm_stderr\": 0.032685659755777245,\n \"mc1\": 0.386780905752754,\n\
\ \"mc1_stderr\": 0.017048857010515107,\n \"mc2\": 0.5638904892258573,\n\
\ \"mc2_stderr\": 0.015266476380236345\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6510238907849829,\n \"acc_stderr\": 0.013928933461382501,\n\
\ \"acc_norm\": 0.6766211604095563,\n \"acc_norm_stderr\": 0.013669421630012129\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6686914957179845,\n\
\ \"acc_stderr\": 0.004697217912463012,\n \"acc_norm\": 0.8593905596494722,\n\
\ \"acc_norm_stderr\": 0.0034690778470563956\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.022755204959542946,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.022755204959542946\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.02925290592725197,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.02925290592725197\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834846,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834846\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.02370309952525817,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.02370309952525817\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n\
\ \"acc_stderr\": 0.016083749986853694,\n \"acc_norm\": 0.36312849162011174,\n\
\ \"acc_norm_stderr\": 0.016083749986853694\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.012741974333897226,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.012741974333897226\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.02777829870154544,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.02777829870154544\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139968,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139968\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.386780905752754,\n\
\ \"mc1_stderr\": 0.017048857010515107,\n \"mc2\": 0.5638904892258573,\n\
\ \"mc2_stderr\": 0.015266476380236345\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235798\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6133434420015162,\n \
\ \"acc_stderr\": 0.013413955095965314\n }\n}\n```"
repo_url: https://huggingface.co/giraffe176/Open_Neural_Monarch_Maidv0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|arc:challenge|25_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|gsm8k|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hellaswag|10_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T03-24-19.468676.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T03-24-19.468676.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- '**/details_harness|winogrande|5_2024-03-01T03-24-19.468676.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T03-24-19.468676.parquet'
- config_name: results
data_files:
- split: 2024_03_01T03_24_19.468676
path:
- results_2024-03-01T03-24-19.468676.parquet
- split: latest
path:
- results_2024-03-01T03-24-19.468676.parquet
---
# Dataset Card for Evaluation run of giraffe176/Open_Neural_Monarch_Maidv0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [giraffe176/Open_Neural_Monarch_Maidv0.1](https://huggingface.co/giraffe176/Open_Neural_Monarch_Maidv0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_giraffe176__Open_Neural_Monarch_Maidv0.1",
"harness_winogrande_5",
split="train")
```
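The split-naming convention can also be worked with directly: each run's split name is its timestamp with `:` and `-` replaced by `_`, and "latest" aliases the newest run. A minimal offline sketch of how that mapping can be recovered (the earlier split name here is hypothetical, added only for illustration):

```python
from datetime import datetime

# Each run is stored as a split named with its sanitized timestamp
# (e.g. "2024_03_01T03_24_19.468676"); "latest" aliases the newest one.
run_splits = [
    "2024_02_15T10_00_00.000000",  # hypothetical earlier run
    "2024_03_01T03_24_19.468676",  # the run documented in this card
]

def parse_split(name: str) -> datetime:
    # Split names replace ":" and "-" with "_", so parse that format directly.
    return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")

latest = max(run_splits, key=parse_split)
```

Passing `latest` (or any timestamped split name) as the `split` argument of `load_dataset` selects the corresponding run.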
## Latest results
These are the [latest results from run 2024-03-01T03:24:19.468676](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__Open_Neural_Monarch_Maidv0.1/blob/main/results_2024-03-01T03-24-19.468676.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of the corresponding configuration):
```python
{
"all": {
"acc": 0.6522307491293496,
"acc_stderr": 0.03204488687562026,
"acc_norm": 0.6541663775575516,
"acc_norm_stderr": 0.032685659755777245,
"mc1": 0.386780905752754,
"mc1_stderr": 0.017048857010515107,
"mc2": 0.5638904892258573,
"mc2_stderr": 0.015266476380236345
},
"harness|arc:challenge|25": {
"acc": 0.6510238907849829,
"acc_stderr": 0.013928933461382501,
"acc_norm": 0.6766211604095563,
"acc_norm_stderr": 0.013669421630012129
},
"harness|hellaswag|10": {
"acc": 0.6686914957179845,
"acc_stderr": 0.004697217912463012,
"acc_norm": 0.8593905596494722,
"acc_norm_stderr": 0.0034690778470563956
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695255,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695255
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.022755204959542946,
"acc_norm": 0.8,
"acc_norm_stderr": 0.022755204959542946
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.02925290592725197,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.02925290592725197
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834846,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834846
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.02370309952525817,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.02370309952525817
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.016083749986853694,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.016083749986853694
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897226,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897226
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.02777829870154544,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.02777829870154544
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.02783302387139968,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.02783302387139968
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.386780905752754,
"mc1_stderr": 0.017048857010515107,
"mc2": 0.5638904892258573,
"mc2_stderr": 0.015266476380236345
},
"harness|winogrande|5": {
"acc": 0.7932123125493291,
"acc_stderr": 0.011382566829235798
},
"harness|gsm8k|5": {
"acc": 0.6133434420015162,
"acc_stderr": 0.013413955095965314
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yzhuang/autotree_automl_heloc_gosdt_l256_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5845600000
num_examples: 100000
- name: validation
num_bytes: 584560000
num_examples: 10000
download_size: 746646741
dataset_size: 6430160000
---
# Dataset Card for "autotree_automl_heloc_gosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_201 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 974578248
num_examples: 191394
download_size: 994405182
dataset_size: 974578248
---
# Dataset Card for "chunk_201"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
damilojohn/Text-Descrambling | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
---
This dataset contains sentences and their scrambled counterparts (which carry no semantic meaning). |
ANANDHU-SCT/TOPIC_CLASSIFICATION | ---
license: apache-2.0
---
|
stablediffusiontutorials/ComfyUI-IpAdapterV2-Nodes-Workflow | ---
license: cc-by-3.0
---
Download the required ComfyUI nodes for IP-Adapter V2 (all the workflows are provided from the official repository)
|
recastai/LAION-art-EN-improved-captions | ---
license: cc-by-4.0
dataset_info:
features:
- name: orig_caption
dtype: string
- name: generated_caption
dtype: string
- name: key
dtype: string
- name: url
dtype: string
- name: index
dtype: int64
splits:
- name: train
num_bytes: 681710086
num_examples: 2684160
download_size: 441945582
dataset_size: 681710086
language:
- en
---
# Dataset Card for LAION-art-EN-improved-captions
### Dataset Summary
This dataset has been created by **Re:cast AI** to improve the semantic alignment of image-caption pairs. The `generated_caption` field was created in a semi-supervised fashion using the **Salesforce/blip2-flan-t5-xxl** model.
### Supported Tasks
Fine-tuning text-to-image generators (e.g. Stable Diffusion), or serving as a searchable prompt database (requires building a FAISS index).
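For the fine-tuning use case, a minimal sketch of consuming a record might look like this (the helper function, the sample record, and its values are illustrative assumptions, not part of the dataset; only the field names come from the card):

```python
def pick_caption(record: dict) -> str:
    """Prefer the BLIP-2 generated caption, falling back to the original.

    Assumes the `orig_caption` / `generated_caption` fields listed under
    Data Fields below; the fallback handles empty generated captions.
    """
    caption = record.get("generated_caption") or record.get("orig_caption")
    return caption.strip()


# Loading the actual split would look like (not run here, requires network):
# from datasets import load_dataset
# ds = load_dataset("recastai/LAION-art-EN-improved-captions", split="train")

# Hypothetical record matching the schema, for illustration only:
sample = {
    "orig_caption": " a painting ",
    "generated_caption": "an oil painting of a harbor at dusk",
    "key": "000000001",
    "index": 1,
    "url": "https://example.com/img.jpg",
}
print(pick_caption(sample))
```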
## Dataset Structure
### Data Fields
- orig_caption
- generated_caption
- key
- index
- url
### Data Splits
- train
### Source Data
LAION-Art |
open-llm-leaderboard/details_haoranxu__ALMA-13B-R | ---
pretty_name: Evaluation run of haoranxu/ALMA-13B-R
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [haoranxu/ALMA-13B-R](https://huggingface.co/haoranxu/ALMA-13B-R) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_haoranxu__ALMA-13B-R\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-20T07:24:09.655926](https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-13B-R/blob/main/results_2024-01-20T07-24-09.655926.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4935056783282025,\n\
\ \"acc_stderr\": 0.03402030256976682,\n \"acc_norm\": 0.5012713038885745,\n\
\ \"acc_norm_stderr\": 0.03495235233822466,\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299962,\n \"mc2\": 0.36085639370497274,\n\
\ \"mc2_stderr\": 0.013958476205822561\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5290102389078498,\n \"acc_stderr\": 0.014586776355294328,\n\
\ \"acc_norm\": 0.5554607508532423,\n \"acc_norm_stderr\": 0.01452122640562708\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5967934674367655,\n\
\ \"acc_stderr\": 0.004895390341445622,\n \"acc_norm\": 0.7944632543318064,\n\
\ \"acc_norm_stderr\": 0.00403267443344754\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5207547169811321,\n \"acc_stderr\": 0.030746349975723463,\n\
\ \"acc_norm\": 0.5207547169811321,\n \"acc_norm_stderr\": 0.030746349975723463\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.4682080924855491,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364397,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364397\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400352,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400352\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.024180497164376907,\n \"\
acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376907\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.038932596106046734,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.038932596106046734\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5516129032258065,\n \"acc_stderr\": 0.028292056830112735,\n \"\
acc_norm\": 0.5516129032258065,\n \"acc_norm_stderr\": 0.028292056830112735\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n \"\
acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.03793713171165635,\n\
\ \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.03793713171165635\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5808080808080808,\n \"acc_stderr\": 0.03515520728670417,\n \"\
acc_norm\": 0.5808080808080808,\n \"acc_norm_stderr\": 0.03515520728670417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.689119170984456,\n \"acc_stderr\": 0.033403619062765864,\n\
\ \"acc_norm\": 0.689119170984456,\n \"acc_norm_stderr\": 0.033403619062765864\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44871794871794873,\n \"acc_stderr\": 0.025217315184846482,\n\
\ \"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.025217315184846482\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115006,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115006\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.03479185572599661,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.03479185572599661\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6385321100917432,\n \"acc_stderr\": 0.020598082009937378,\n \"\
acc_norm\": 0.6385321100917432,\n \"acc_norm_stderr\": 0.020598082009937378\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"\
acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5784313725490197,\n \"acc_stderr\": 0.03465868196380762,\n \"\
acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.03465868196380762\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6286919831223629,\n \"acc_stderr\": 0.03145068600744859,\n \
\ \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.03145068600744859\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041019,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041019\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112722,\n\
\ \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112722\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.04777615181156739,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.04777615181156739\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.717948717948718,\n\
\ \"acc_stderr\": 0.02948036054954119,\n \"acc_norm\": 0.717948717948718,\n\
\ \"acc_norm_stderr\": 0.02948036054954119\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6717752234993615,\n\
\ \"acc_stderr\": 0.016791685640192892,\n \"acc_norm\": 0.6717752234993615,\n\
\ \"acc_norm_stderr\": 0.016791685640192892\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5635838150289018,\n \"acc_stderr\": 0.026700545424943673,\n\
\ \"acc_norm\": 0.5635838150289018,\n \"acc_norm_stderr\": 0.026700545424943673\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n\
\ \"acc_stderr\": 0.015366860386397108,\n \"acc_norm\": 0.3027932960893855,\n\
\ \"acc_norm_stderr\": 0.015366860386397108\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5424836601307189,\n \"acc_stderr\": 0.02852638345214264,\n\
\ \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.02852638345214264\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n\
\ \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.5819935691318328,\n\
\ \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5709876543209876,\n \"acc_stderr\": 0.027538925613470863,\n\
\ \"acc_norm\": 0.5709876543209876,\n \"acc_norm_stderr\": 0.027538925613470863\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281285,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281285\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3670143415906128,\n\
\ \"acc_stderr\": 0.012310264244842124,\n \"acc_norm\": 0.3670143415906128,\n\
\ \"acc_norm_stderr\": 0.012310264244842124\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4522058823529412,\n \"acc_stderr\": 0.030233758551596455,\n\
\ \"acc_norm\": 0.4522058823529412,\n \"acc_norm_stderr\": 0.030233758551596455\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.511437908496732,\n \"acc_stderr\": 0.020222541515610863,\n \
\ \"acc_norm\": 0.511437908496732,\n \"acc_norm_stderr\": 0.020222541515610863\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5224489795918368,\n \"acc_stderr\": 0.031976941187136725,\n\
\ \"acc_norm\": 0.5224489795918368,\n \"acc_norm_stderr\": 0.031976941187136725\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n\
\ \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.6567164179104478,\n\
\ \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.03851597683718534,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.03851597683718534\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299962,\n \"mc2\": 0.36085639370497274,\n\
\ \"mc2_stderr\": 0.013958476205822561\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.01212140294285557\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/haoranxu/ALMA-13B-R
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|arc:challenge|25_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|gsm8k|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hellaswag|10_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T07-24-09.655926.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T07-24-09.655926.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- '**/details_harness|winogrande|5_2024-01-20T07-24-09.655926.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-20T07-24-09.655926.parquet'
- config_name: results
data_files:
- split: 2024_01_20T07_24_09.655926
path:
- results_2024-01-20T07-24-09.655926.parquet
- split: latest
path:
- results_2024-01-20T07-24-09.655926.parquet
---
# Dataset Card for Evaluation run of haoranxu/ALMA-13B-R
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [haoranxu/ALMA-13B-R](https://huggingface.co/haoranxu/ALMA-13B-R) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_haoranxu__ALMA-13B-R",
	"harness_winogrande_5",
	split="latest")
```
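Because run splits are named by their timestamp, split names sort lexicographically in chronological order, so the newest run can also be located programmatically without relying on the "latest" alias. A minimal sketch (the split names below are taken from this card):

```python
# Sketch: pick the newest run split by its timestamp-based name.
# Names like "2024_01_20T07_24_09.655926" sort lexicographically
# in chronological order, so max() yields the most recent run.
splits = [
    "2024_01_18T12_00_00.000000",  # hypothetical earlier run, for illustration
    "2024_01_20T07_24_09.655926",  # the run covered by this card
    "latest",
]
timestamped = [s for s in splits if s != "latest"]
newest = max(timestamped)
print(newest)  # 2024_01_20T07_24_09.655926
```

The earlier timestamp above is a placeholder; this dataset currently contains a single run, so `newest` and `"latest"` refer to the same data.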
## Latest results
These are the [latest results from run 2024-01-20T07:24:09.655926](https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-13B-R/blob/main/results_2024-01-20T07-24-09.655926.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4935056783282025,
"acc_stderr": 0.03402030256976682,
"acc_norm": 0.5012713038885745,
"acc_norm_stderr": 0.03495235233822466,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299962,
"mc2": 0.36085639370497274,
"mc2_stderr": 0.013958476205822561
},
"harness|arc:challenge|25": {
"acc": 0.5290102389078498,
"acc_stderr": 0.014586776355294328,
"acc_norm": 0.5554607508532423,
"acc_norm_stderr": 0.01452122640562708
},
"harness|hellaswag|10": {
"acc": 0.5967934674367655,
"acc_stderr": 0.004895390341445622,
"acc_norm": 0.7944632543318064,
"acc_norm_stderr": 0.00403267443344754
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5207547169811321,
"acc_stderr": 0.030746349975723463,
"acc_norm": 0.5207547169811321,
"acc_norm_stderr": 0.030746349975723463
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364397,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364397
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400352,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400352
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376907,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.038932596106046734,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.038932596106046734
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5516129032258065,
"acc_stderr": 0.028292056830112735,
"acc_norm": 0.5516129032258065,
"acc_norm_stderr": 0.028292056830112735
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.03413963805906235,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.03413963805906235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.03793713171165635,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.03793713171165635
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5808080808080808,
"acc_stderr": 0.03515520728670417,
"acc_norm": 0.5808080808080808,
"acc_norm_stderr": 0.03515520728670417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.689119170984456,
"acc_stderr": 0.033403619062765864,
"acc_norm": 0.689119170984456,
"acc_norm_stderr": 0.033403619062765864
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.025217315184846482,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.025217315184846482
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115006,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.03479185572599661,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.03479185572599661
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6385321100917432,
"acc_stderr": 0.020598082009937378,
"acc_norm": 0.6385321100917432,
"acc_norm_stderr": 0.020598082009937378
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.03465868196380762,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.03465868196380762
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.03145068600744859,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.03145068600744859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041019,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041019
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.03889066619112722,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.03889066619112722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.04777615181156739,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.04777615181156739
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.02948036054954119,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.02948036054954119
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6717752234993615,
"acc_stderr": 0.016791685640192892,
"acc_norm": 0.6717752234993615,
"acc_norm_stderr": 0.016791685640192892
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5635838150289018,
"acc_stderr": 0.026700545424943673,
"acc_norm": 0.5635838150289018,
"acc_norm_stderr": 0.026700545424943673
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.015366860386397108,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.015366860386397108
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.02852638345214264,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.02852638345214264
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5819935691318328,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.5819935691318328,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5709876543209876,
"acc_stderr": 0.027538925613470863,
"acc_norm": 0.5709876543209876,
"acc_norm_stderr": 0.027538925613470863
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.028723863853281285,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.028723863853281285
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3670143415906128,
"acc_stderr": 0.012310264244842124,
"acc_norm": 0.3670143415906128,
"acc_norm_stderr": 0.012310264244842124
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4522058823529412,
"acc_stderr": 0.030233758551596455,
"acc_norm": 0.4522058823529412,
"acc_norm_stderr": 0.030233758551596455
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.511437908496732,
"acc_stderr": 0.020222541515610863,
"acc_norm": 0.511437908496732,
"acc_norm_stderr": 0.020222541515610863
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5224489795918368,
"acc_stderr": 0.031976941187136725,
"acc_norm": 0.5224489795918368,
"acc_norm_stderr": 0.031976941187136725
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6567164179104478,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.6567164179104478,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.03851597683718534,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.03851597683718534
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299962,
"mc2": 0.36085639370497274,
"mc2_stderr": 0.013958476205822561
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.01212140294285557
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Kevin-M-Smith/flint_images_300_300 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': clutter
'1': email
'2': email-squished
'3': handwritten-document
'4': spreadsheet
'5': typeset-document
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 176737820.0
num_examples: 4965
- name: test
num_bytes: 44473375.0
num_examples: 1242
download_size: 221048030
dataset_size: 221211195.0
---
# Dataset Card for "flint_images_300_300"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
7uk3y/articles | ---
license: mit
---
|
NomaDamas/eli5-document | ---
license: unknown
dataset_info:
features:
- name: id
dtype: string
- name: document
dtype: string
- name: doc_id
dtype: string
splits:
- name: train
num_bytes: 1567142371
num_examples: 262936
download_size: 968431765
dataset_size: 1567142371
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
openerotica/gorgon-lima-v0.1 | ---
license: apache-2.0
---
|
CyberHarem/azusa_mifuyu_puellamagimadokamagicasidestorymagiarecord | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Azusa Mifuyu
This is the dataset of Azusa Mifuyu, containing 109 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 109 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 260 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 109 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 109 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 109 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 109 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 109 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 260 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 260 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 260 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Boyuan07/5508nailset_diffusion.4.text_to_image | ---
dataset_info:
features:
- name: image
dtype: image
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 115302402.0
num_examples: 218
download_size: 0
dataset_size: 115302402.0
---
# Dataset Card for "5508nailset_diffusion.4.text_to_image"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/find_first_sent_train_50_eval_10 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 135430
num_examples: 110
- name: validation
num_bytes: 9027
num_examples: 10
download_size: 79296
dataset_size: 144457
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_first_sent_train_50_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ncardus/raptor | ---
dataset_info:
features:
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 35527776
num_examples: 3220
download_size: 12672129
dataset_size: 35527776
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
anti-ai/tydiqa-goldp-vietnamese | ---
license: mit
task_categories:
- text-retrieval
language:
- vi
size_categories:
- 1K<n<10K
configs:
- config_name: default
data_files:
- split: corpus
path: corpus/tydiqa_gold_corpus.json
- split: test
path: test/tydiqa_gold_test.jsonl
--- |
datasets-examples/doc-formats-csv-2 | ---
configs:
- config_name: default
data_files: "*.csv"
sep: ","
size_categories:
- n<1K
---
# [doc] formats - csv - 2
This dataset contains one csv file at the root:
- [data.csv](./data.csv)
```csv
kind,sound
dog,woof
cat,meow
pokemon,pika
human,hello
```
We define the separator as `","` in the YAML config, as well as the config name and the location of the file, with a glob expression:
```yaml
---
configs:
- config_name: default
data_files: "*.csv"
sep: ","
size_categories:
- n<1K
---
```
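Parsing the file above with the declared `","` separator can be sketched with the Python standard library; the inline string below simply mirrors the contents of `data.csv` shown earlier:

```python
import csv
import io

# Mirror of data.csv from this dataset (inlined for a self-contained sketch).
data = "kind,sound\ndog,woof\ncat,meow\npokemon,pika\nhuman,hello\n"

# Parse using the "," separator declared in the YAML config.
rows = list(csv.DictReader(io.StringIO(data), delimiter=","))
```

Each row comes back as a dict keyed by the header, e.g. `rows[0]["sound"]` is `"woof"`.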
|
causal-lm/webchat | ---
language: en
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 56356014.70190586
num_examples: 17094
- name: validation
num_bytes: 6263977.298094135
num_examples: 1900
download_size: 37541970
dataset_size: 62619992.0
---
# Dataset Card for "webchat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-conll2003-conll2003-19d2d7-41085145069 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- conll2003
eval_info:
task: entity_extraction
model: Ce/bert-finetuned-ner
metrics: []
dataset_name: conll2003
dataset_config: conll2003
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: Ce/bert-finetuned-ner
* Dataset: conll2003
* Config: conll2003
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ashwathjadhav23](https://huggingface.co/ashwathjadhav23) for evaluating this model. |
Aisha/BAAD16 | ---
annotations_creators:
- found
- crowdsourced
- expert-generated
language_creators:
- found
- crowdsourced
language:
- bn
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: 'BAAD16: Bangla Authorship Attribution Dataset (16 Authors)'
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
---
## Description
**BAAD16** is an **Authorship Attribution dataset for Bengali Literature**. It was collected and analyzed by the authors of [this paper](https://arxiv.org/abs/2001.05316). It was created by scraping text from an online Bangla e-library using a custom web crawler, and contains literary works of various famous Bangla writers: novels, stories, series, and other works of 16 authors. Each sample document contains 750 words. The dataset is imbalanced and closely resembles real-world scenarios, where not all authors will have a large number of sample texts. The following table gives more details about the dataset.
| Author Name | Number of Samples | Word Count | Unique Words |
| --- | --- | --- | --- |
| zahir rayhan | 185 | 138k | 20k |
| nazrul | 223 | 167k | 33k |
| manik bandhopaddhay | 469 | 351k | 44k |
| nihar ronjon gupta | 476 | 357k | 43k |
| bongkim | 562 | 421k | 62k |
| tarashonkor | 775 | 581k | 84k |
| shottojit roy | 849 | 636k | 67k |
| shordindu | 888 | 666k | 84k |
| toslima nasrin | 931 | 698k | 76k |
| shirshendu | 1048 | 786k | 69k |
| zafar iqbal | 1100 | 825k | 53k |
| robindronath | 1259 | 944k | 89k |
| shorotchandra | 1312 | 984k | 78k |
| shomresh | 1408 | 1056k | 69k |
| shunil gongopaddhay | 1963 | 1472k | 109k |
| humayun ahmed | 4518 | 3388k | 161k |
| **Total** | 17,966 | 13,474,500 | 590,660 |
| **Average** | 1,122.875 | 842,156.25 | 71,822.25 |
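The fixed 750-word sampling described above can be sketched as follows. This is an illustrative reconstruction, not the authors' preprocessing code; the helper name and chunking policy (non-overlapping windows, dropping a trailing partial chunk) are assumptions:

```python
# Split an author's corpus into fixed-size 750-word sample documents,
# as described in the dataset summary above.
def make_samples(text: str, words_per_sample: int = 750) -> list:
    words = text.split()
    # Non-overlapping windows; a trailing chunk shorter than the
    # window size is dropped so every sample has exactly 750 words.
    return [
        " ".join(words[i:i + words_per_sample])
        for i in range(0, len(words) - words_per_sample + 1, words_per_sample)
    ]

samples = make_samples("word " * 1500)
```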
## Citation
If you use this dataset, please cite the paper [Authorship Attribution in Bangla literature using Character-level CNN](https://ieeexplore.ieee.org/abstract/document/9038560/). [arXiv link](https://arxiv.org/abs/2001.05316).
```
@inproceedings{BAAD16Dataset,
title={Authorship Attribution in Bangla literature using Character-level CNN},
author={Khatun, Aisha and Rahman, Anisur and Islam, Md Saiful and others},
booktitle={2019 22nd International Conference on Computer and Information Technology (ICCIT)},
pages={1--5},
year={2019},
organization={IEEE},
doi={10.1109/ICCIT48885.2019.9038560}
}
```
This dataset is also available in Mendeley: [BAAD16 dataset](https://data.mendeley.com/datasets/6d9jrkgtvv/4). Always make sure to use the latest version of the dataset. Cite the dataset directly by:
```
@misc{BAAD16Dataset,
author = {Khatun, Aisha and Rahman, Anisur and Islam, Md. Saiful},
title = {BAAD16: Bangla Authorship Attribution Dataset},
year={2019},
doi = {10.17632/6d9jrkgtvv.4},
howpublished= {\url{https://data.mendeley.com/datasets/6d9jrkgtvv/4}}
}
``` |
ivelin/ui_refexp | ---
license: cc-by-4.0
task_categories:
- image-to-text
tags:
- ui-referring-expression
- ui-refexp
language:
- en
pretty_name: UI understanding
size_categories:
- 10K<n<100K
dataset_info:
features:
- name: screenshot
dtype: image
- name: prompt
dtype: string
- name: target_bounding_box
dtype: string
config_name: ui_refexp
splits:
- name: train
num_bytes: 562037265
num_examples: 15624
- name: validation
num_bytes: 60399225
num_examples: 471
- name: test
num_bytes: 69073969
num_examples: 565
download_size: 6515012176
dataset_size: 691510459
---
# Dataset Card for UIBert
## Dataset Description
- **Homepage:** https://github.com/google-research-datasets/uibert
- **Repository:** https://github.com/google-research-datasets/uibert
- **Paper:** https://arxiv.org/abs/2107.13731
- **Leaderboard:**
- UIBert: https://arxiv.org/abs/2107.13731
- Pix2Struct: https://arxiv.org/pdf/2210.03347
### Dataset Summary
This is a Hugging Face formatted dataset derived from the [Google UIBert dataset](https://github.com/google-research-datasets/uibert), which is in turn derived from the [RICO dataset](https://interactionmining.org/rico).
### Supported Tasks and Leaderboards
- UI Understanding
- UI Referring Expressions
- UI Action Automation
### Languages
- English
## Dataset Structure
- `screenshot`: blob of pixels.
- `prompt`: Prompt referring to a UI component with an optional action verb. For example "click on search button next to menu drawer."
- `target_bounding_box`: Bounding box of targeted UI components. `[xmin, ymin, xmax, ymax]`
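Since the card stores `target_bounding_box` as a string, decoding is needed before use. A minimal sketch, assuming the value is serialized as a JSON list `[xmin, ymin, xmax, ymax]` (the exact serialization is an assumption; inspect a sample first):

```python
import json

# Hypothetical helper: decode one target_bounding_box value, assuming it
# is stored as a JSON-encoded [xmin, ymin, xmax, ymax] list.
def parse_bbox(raw: str) -> dict:
    xmin, ymin, xmax, ymax = json.loads(raw)
    return {"xmin": xmin, "ymin": ymin, "xmax": xmax, "ymax": ymax}

bbox = parse_bbox("[10, 20, 110, 220]")
```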
### Data Splits
- train: 15K samples
- validation: 471 samples
- test: 565 samples
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
CATIE-AQ/fquad_fr_prompt_question_generation_with_answer | ---
language:
- fr
license:
- cc-by-nc-sa-3.0
size_categories:
- 100K<n<1M
task_categories:
- text-generation
tags:
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- fquad
---
# fquad_fr_prompt_question_generation_with_answer
## Summary
**fquad_fr_prompt_question_generation_with_answer** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **526,218** rows that can be used for a question-generation (with answer) task.
The original data (without prompts) comes from the dataset [FQuAD](https://huggingface.co/datasets/fquad) by d'Hoffschmidt et al. and was augmented with questions in SQuAD 2.0 format in the [FrenchQA](https://huggingface.co/datasets/CATIE-AQ/frenchQA) dataset.
As FQuAD's license does not allow data to be shared, we simply share the prompts used, so that users can recreate the dataset themselves in the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
22 prompts were created for this dataset. The logic applied consists of proposing prompts in the indicative mood, as well as in both the informal (*tutoiement*) and formal (*vouvoiement*) French forms of address.
```
'Quelle question donnerait la réponse suivante ? Réponse : "'+answer+'";\nQuestion :',
'Déterminer la question qui aurait pu être posée pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Détermine la question que tu aurais pu poser pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Déterminez la question que vous auriez pu poser pour obtenir la réponse suivante . \n Réponse : "'+answer+'";\n Question :',
'Quelle question aurait pu être posée pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Quelle question aurais-tu pu poser pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Quelle question auriez-vous pu poser pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Quelle question aurait pu être posée pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Quelle question aurais-tu pu poser pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Quelle question auriez-vous pu poser pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Sachant la réponse suivante : "'+answer+'"\n Générer une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Génère une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Générez une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Trouver une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Trouves une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Trouvez une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Créer une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Crée trouver une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Créez trouver une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Ecrire une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Ecris une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Ecrivez une bonne question : '
```
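Since this repository shares only the prompts, recreating the rows amounts to formatting each (answer, question) pair from FQuAD with every template. A minimal sketch under that assumption; the helper and field names (`inputs`/`targets`, following the xP3 convention) are illustrative, not the authors' script, and only two of the 22 templates are shown:

```python
# Two of the 22 prompt templates listed above, with {a} as the answer slot.
templates = [
    'Quelle question donnerait la réponse suivante ? Réponse : "{a}";\nQuestion :',
    'Sachant la réponse suivante : "{a}"\n Générer une bonne question : ',
]

def build_rows(answer: str, question: str) -> list:
    # One row per template: the formatted prompt becomes "inputs",
    # the original question becomes "targets".
    return [{"inputs": t.format(a=answer), "targets": question} for t in templates]

rows = build_rows("Paris", "Quelle est la capitale de la France ?")
```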
# Splits
- `train` with 456,082 samples
- `valid` with 70,136 samples
- no test split
# How to use?
This repository doesn't contain any data; as noted above, FQuAD's license does not permit redistribution, so users must recreate the dataset themselves by applying the prompts to the original data.
# Citation
## Original data
> @ARTICLE{2020arXiv200206071,
>   author = {d'Hoffschmidt, Martin and Vidal, Maxime and Belblidia, Wacim and Brendlé, Tom},
>   title = "{FQuAD: French Question Answering Dataset}",
>   journal = {arXiv e-prints},
>   keywords = {Computer Science - Computation and Language},
>   year = "2020",
>   month = "Feb",
>   eid = {arXiv:2002.06071},
>   pages = {arXiv:2002.06071},
>   archivePrefix = {arXiv},
>   eprint = {2002.06071},
>   primaryClass = {cs.CL}
> }
## This Dataset
> @misc{centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
>   author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
>   title = { DFP (Revision 1d24c09) },
>   year = 2023,
>   url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
>   doi = { 10.57967/hf/1200 },
>   publisher = { Hugging Face }
> }
## License
CC BY-NC-SA 3.0 |
freshpearYoon/v3_train_free_concat_45 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 3842744136
num_examples: 2500
download_size: 1820357494
dataset_size: 3842744136
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Michaelber123/mike | ---
license: artistic-2.0
---
|
open-llm-leaderboard/details_xaviviro__FLAMA-0.5-3B | ---
pretty_name: Evaluation run of xaviviro/FLAMA-0.5-3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xaviviro/FLAMA-0.5-3B](https://huggingface.co/xaviviro/FLAMA-0.5-3B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xaviviro__FLAMA-0.5-3B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T14:14:38.357079](https://huggingface.co/datasets/open-llm-leaderboard/details_xaviviro__FLAMA-0.5-3B/blob/main/results_2024-01-04T14-14-38.357079.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2647131817157082,\n\
\ \"acc_stderr\": 0.031070487329395803,\n \"acc_norm\": 0.2664900529641518,\n\
\ \"acc_norm_stderr\": 0.03184837841149458,\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834557,\n \"mc2\": 0.4111139604848093,\n\
\ \"mc2_stderr\": 0.014893834532788594\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3532423208191126,\n \"acc_stderr\": 0.013967822714840055,\n\
\ \"acc_norm\": 0.3796928327645051,\n \"acc_norm_stderr\": 0.014182119866974874\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4980083648675563,\n\
\ \"acc_stderr\": 0.004989741826250387,\n \"acc_norm\": 0.676458872734515,\n\
\ \"acc_norm_stderr\": 0.00466871068919241\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n\
\ \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.2740740740740741,\n\
\ \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2565789473684211,\n \"acc_stderr\": 0.035541803680256896,\n\
\ \"acc_norm\": 0.2565789473684211,\n \"acc_norm_stderr\": 0.035541803680256896\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.027724236492700904,\n\
\ \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.027724236492700904\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.22127659574468084,\n \"acc_stderr\": 0.027136349602424056,\n\
\ \"acc_norm\": 0.22127659574468084,\n \"acc_norm_stderr\": 0.027136349602424056\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843673,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843673\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.02286083830923207,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.02286083830923207\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.03619604524124251,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.03619604524124251\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.23870967741935484,\n \"acc_stderr\": 0.02425107126220884,\n \"\
acc_norm\": 0.23870967741935484,\n \"acc_norm_stderr\": 0.02425107126220884\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293753,\n \"\
acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293753\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3434343434343434,\n \"acc_stderr\": 0.033832012232444426,\n \"\
acc_norm\": 0.3434343434343434,\n \"acc_norm_stderr\": 0.033832012232444426\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752954,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752954\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.258974358974359,\n \"acc_stderr\": 0.02221110681006166,\n \
\ \"acc_norm\": 0.258974358974359,\n \"acc_norm_stderr\": 0.02221110681006166\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230182,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230182\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.19327731092436976,\n \"acc_stderr\": 0.025649470265889193,\n\
\ \"acc_norm\": 0.19327731092436976,\n \"acc_norm_stderr\": 0.025649470265889193\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23853211009174313,\n \"acc_stderr\": 0.01827257581023186,\n \"\
acc_norm\": 0.23853211009174313,\n \"acc_norm_stderr\": 0.01827257581023186\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.22685185185185186,\n \"acc_stderr\": 0.02856165010242227,\n \"\
acc_norm\": 0.22685185185185186,\n \"acc_norm_stderr\": 0.02856165010242227\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.21518987341772153,\n \"acc_stderr\": 0.026750826994676187,\n \
\ \"acc_norm\": 0.21518987341772153,\n \"acc_norm_stderr\": 0.026750826994676187\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.18834080717488788,\n\
\ \"acc_stderr\": 0.026241132996407256,\n \"acc_norm\": 0.18834080717488788,\n\
\ \"acc_norm_stderr\": 0.026241132996407256\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3884297520661157,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.3884297520661157,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.028911208802749465,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.028911208802749465\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2950191570881226,\n\
\ \"acc_stderr\": 0.01630836377293272,\n \"acc_norm\": 0.2950191570881226,\n\
\ \"acc_norm_stderr\": 0.01630836377293272\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2774566473988439,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.2774566473988439,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2801418439716312,\n \"acc_stderr\": 0.026789172351140235,\n \
\ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.026789172351140235\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24445893089960888,\n\
\ \"acc_stderr\": 0.01097642501311389,\n \"acc_norm\": 0.24445893089960888,\n\
\ \"acc_norm_stderr\": 0.01097642501311389\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.02352924218519311,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.02352924218519311\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26633986928104575,\n \"acc_stderr\": 0.0178831881346672,\n \
\ \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.0178831881346672\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n\
\ \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.2545454545454545,\n\
\ \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.028920583220675585,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.028920583220675585\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n\
\ \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.22885572139303484,\n\
\ \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.034605799075530255,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.034605799075530255\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.034240429246915824,\n\
\ \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.034240429246915824\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834557,\n \"mc2\": 0.4111139604848093,\n\
\ \"mc2_stderr\": 0.014893834532788594\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6211523283346487,\n \"acc_stderr\": 0.01363372460318033\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.008339651250947688,\n \
\ \"acc_stderr\": 0.0025049422268605234\n }\n}\n```"
repo_url: https://huggingface.co/xaviviro/FLAMA-0.5-3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|arc:challenge|25_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|gsm8k|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hellaswag|10_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-14-38.357079.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T14-14-38.357079.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- '**/details_harness|winogrande|5_2024-01-04T14-14-38.357079.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T14-14-38.357079.parquet'
- config_name: results
data_files:
- split: 2024_01_04T14_14_38.357079
path:
- results_2024-01-04T14-14-38.357079.parquet
- split: latest
path:
- results_2024-01-04T14-14-38.357079.parquet
---
# Dataset Card for Evaluation run of xaviviro/FLAMA-0.5-3B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [xaviviro/FLAMA-0.5-3B](https://huggingface.co/xaviviro/FLAMA-0.5-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xaviviro__FLAMA-0.5-3B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-04T14:14:38.357079](https://huggingface.co/datasets/open-llm-leaderboard/details_xaviviro__FLAMA-0.5-3B/blob/main/results_2024-01-04T14-14-38.357079.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2647131817157082,
"acc_stderr": 0.031070487329395803,
"acc_norm": 0.2664900529641518,
"acc_norm_stderr": 0.03184837841149458,
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834557,
"mc2": 0.4111139604848093,
"mc2_stderr": 0.014893834532788594
},
"harness|arc:challenge|25": {
"acc": 0.3532423208191126,
"acc_stderr": 0.013967822714840055,
"acc_norm": 0.3796928327645051,
"acc_norm_stderr": 0.014182119866974874
},
"harness|hellaswag|10": {
"acc": 0.4980083648675563,
"acc_stderr": 0.004989741826250387,
"acc_norm": 0.676458872734515,
"acc_norm_stderr": 0.00466871068919241
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2565789473684211,
"acc_stderr": 0.035541803680256896,
"acc_norm": 0.2565789473684211,
"acc_norm_stderr": 0.035541803680256896
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.027724236492700904,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.027724236492700904
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.22127659574468084,
"acc_stderr": 0.027136349602424056,
"acc_norm": 0.22127659574468084,
"acc_norm_stderr": 0.027136349602424056
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843673,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843673
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.02286083830923207,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.02286083830923207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.03619604524124251,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.03619604524124251
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293753,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293753
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3434343434343434,
"acc_stderr": 0.033832012232444426,
"acc_norm": 0.3434343434343434,
"acc_norm_stderr": 0.033832012232444426
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.258974358974359,
"acc_stderr": 0.02221110681006166,
"acc_norm": 0.258974358974359,
"acc_norm_stderr": 0.02221110681006166
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230182,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230182
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19327731092436976,
"acc_stderr": 0.025649470265889193,
"acc_norm": 0.19327731092436976,
"acc_norm_stderr": 0.025649470265889193
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23853211009174313,
"acc_stderr": 0.01827257581023186,
"acc_norm": 0.23853211009174313,
"acc_norm_stderr": 0.01827257581023186
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.22685185185185186,
"acc_stderr": 0.02856165010242227,
"acc_norm": 0.22685185185185186,
"acc_norm_stderr": 0.02856165010242227
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604243,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604243
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.21518987341772153,
"acc_stderr": 0.026750826994676187,
"acc_norm": 0.21518987341772153,
"acc_norm_stderr": 0.026750826994676187
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.18834080717488788,
"acc_stderr": 0.026241132996407256,
"acc_norm": 0.18834080717488788,
"acc_norm_stderr": 0.026241132996407256
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3884297520661157,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.3884297520661157,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749465,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749465
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2950191570881226,
"acc_stderr": 0.01630836377293272,
"acc_norm": 0.2950191570881226,
"acc_norm_stderr": 0.01630836377293272
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.026789172351140235,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.026789172351140235
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24445893089960888,
"acc_stderr": 0.01097642501311389,
"acc_norm": 0.24445893089960888,
"acc_norm_stderr": 0.01097642501311389
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.02352924218519311,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.02352924218519311
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.0178831881346672,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.0178831881346672
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.028920583220675585,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.028920583220675585
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772432,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.034605799075530255,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.034605799075530255
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834557,
"mc2": 0.4111139604848093,
"mc2_stderr": 0.014893834532788594
},
"harness|winogrande|5": {
"acc": 0.6211523283346487,
"acc_stderr": 0.01363372460318033
},
"harness|gsm8k|5": {
"acc": 0.008339651250947688,
"acc_stderr": 0.0025049422268605234
}
}
```
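As an illustration, the per-task accuracies in a results dictionary like the one above can be macro-averaged with a short snippet. This is only a sketch, not part of the evaluation pipeline; the `results` dict below is a hypothetical subset of the full JSON, which contains one entry per evaluated task:

```python
# Hypothetical subset of the per-task results shown above; the real
# JSON file contains one entry per evaluated task plus an "all" summary.
results = {
    "harness|arc:challenge|25": {"acc": 0.3532423208191126},
    "harness|hellaswag|10": {"acc": 0.4980083648675563},
    "harness|winogrande|5": {"acc": 0.6211523283346487},
}

# Skip any aggregate "all" entry and average the task-level "acc" values.
task_accs = [v["acc"] for k, v in results.items() if k != "all" and "acc" in v]
macro_avg = sum(task_accs) / len(task_accs)
print(f"macro-average acc over {len(task_accs)} tasks: {macro_avg:.4f}")
```

The leaderboard's displayed averages are computed from the "results" configuration rather than recomputed client-side like this.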
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]