| datasetId | card |
|---|---|
Shoubhik8/instruct_data | ---
dataset_info:
features:
- name: instructions
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 316774393
num_examples: 320339
download_size: 11233992
dataset_size: 316774393
---
# Dataset Card for "instruct_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PingAndPasquale/results | ---
license: apache-2.0
---
|
ironchanchellor/MetalDam_NoBright_Augmented_Cropped | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 309419533.104
num_examples: 1088
- name: validation
num_bytes: 78805194.0
num_examples: 272
download_size: 390268940
dataset_size: 388224727.104
---
# Dataset Card for "MetalDam_NoBright_Augmented_Cropped"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sasha/prof_images_blip__stabilityai-stable-diffusion-2-1-base | ---
dataset_info:
features:
- name: images
dtype: image
- name: embeddings
sequence: float32
splits:
- name: courier
num_bytes: 3573421.0
num_examples: 100
- name: aide
num_bytes: 2817584.0
num_examples: 100
- name: police_officer
num_bytes: 3493332.0
num_examples: 100
- name: purchasing_agent
num_bytes: 3798921.0
num_examples: 100
- name: metal_worker
num_bytes: 5019792.0
num_examples: 100
- name: financial_analyst
num_bytes: 3511611.0
num_examples: 100
- name: stocker
num_bytes: 5028292.0
num_examples: 100
- name: it_specialist
num_bytes: 3657377.0
num_examples: 100
- name: writer
num_bytes: 3430382.0
num_examples: 100
- name: accountant
num_bytes: 3139473.0
num_examples: 100
- name: coach
num_bytes: 3510680.0
num_examples: 100
- name: painter
num_bytes: 3678749.0
num_examples: 100
- name: real_estate_broker
num_bytes: 3504506.0
num_examples: 100
- name: truck_driver
num_bytes: 4387732.0
num_examples: 100
- name: data_entry_keyer
num_bytes: 3834847.0
num_examples: 100
- name: computer_support_specialist
num_bytes: 3723003.0
num_examples: 100
- name: cook
num_bytes: 3331728.0
num_examples: 100
- name: interior_designer
num_bytes: 4207481.0
num_examples: 100
- name: nutritionist
num_bytes: 4060297.0
num_examples: 100
- name: designer
num_bytes: 4366492.0
num_examples: 100
- name: maid
num_bytes: 3025701.0
num_examples: 100
- name: producer
num_bytes: 3735016.0
num_examples: 100
- name: executive_assistant
num_bytes: 3310359.0
num_examples: 100
- name: logistician
num_bytes: 3736991.0
num_examples: 100
- name: tractor_operator
num_bytes: 5755587.0
num_examples: 100
- name: doctor
num_bytes: 3104182.0
num_examples: 100
- name: inventory_clerk
num_bytes: 4532647.0
num_examples: 100
- name: sheet_metal_worker
num_bytes: 4657901.0
num_examples: 100
- name: groundskeeper
num_bytes: 5153242.0
num_examples: 100
- name: electrical_engineer
num_bytes: 5537436.0
num_examples: 100
- name: physical_therapist
num_bytes: 3490827.0
num_examples: 100
- name: insurance_agent
num_bytes: 3297070.0
num_examples: 100
- name: aerospace_engineer
num_bytes: 4497032.0
num_examples: 100
- name: psychologist
num_bytes: 3395399.0
num_examples: 100
- name: financial_advisor
num_bytes: 3122531.0
num_examples: 100
- name: printing_press_operator
num_bytes: 5048137.0
num_examples: 100
- name: architect
num_bytes: 3212333.0
num_examples: 100
- name: dental_hygienist
num_bytes: 3253594.0
num_examples: 100
- name: artist
num_bytes: 3209914.0
num_examples: 100
- name: office_worker
num_bytes: 3342331.0
num_examples: 100
- name: ceo
num_bytes: 3163362.0
num_examples: 100
- name: taxi_driver
num_bytes: 4380564.0
num_examples: 100
- name: librarian
num_bytes: 4803359.0
num_examples: 100
- name: author
num_bytes: 3321969.0
num_examples: 100
- name: plumber
num_bytes: 4157248.0
num_examples: 100
- name: construction_worker
num_bytes: 3919398.0
num_examples: 100
- name: clergy
num_bytes: 3244854.0
num_examples: 100
- name: electrician
num_bytes: 4721187.0
num_examples: 100
- name: jailer
num_bytes: 3792187.0
num_examples: 100
- name: credit_counselor
num_bytes: 3333189.0
num_examples: 100
- name: scientist
num_bytes: 3128838.0
num_examples: 100
- name: drywall_installer
num_bytes: 3259586.0
num_examples: 100
- name: school_bus_driver
num_bytes: 4694012.0
num_examples: 100
- name: dental_assistant
num_bytes: 3224238.0
num_examples: 100
- name: fitness_instructor
num_bytes: 3743598.0
num_examples: 100
- name: detective
num_bytes: 3207867.0
num_examples: 100
- name: hairdresser
num_bytes: 3781112.0
num_examples: 100
- name: welder
num_bytes: 5358221.0
num_examples: 100
- name: pharmacy_technician
num_bytes: 4220593.0
num_examples: 100
- name: compliance_officer
num_bytes: 3231700.0
num_examples: 100
- name: singer
num_bytes: 3377655.0
num_examples: 100
- name: tutor
num_bytes: 3031846.0
num_examples: 100
- name: language_pathologist
num_bytes: 4037466.0
num_examples: 100
- name: medical_records_specialist
num_bytes: 3968675.0
num_examples: 100
- name: sales_manager
num_bytes: 3600033.0
num_examples: 100
- name: industrial_engineer
num_bytes: 4411912.0
num_examples: 100
- name: manager
num_bytes: 3386375.0
num_examples: 100
- name: mechanic
num_bytes: 4630389.0
num_examples: 100
- name: postal_worker
num_bytes: 3435732.0
num_examples: 100
- name: computer_systems_analyst
num_bytes: 4242610.0
num_examples: 100
- name: salesperson
num_bytes: 3611873.0
num_examples: 100
- name: office_clerk
num_bytes: 3118961.0
num_examples: 100
- name: claims_appraiser
num_bytes: 3493777.0
num_examples: 100
- name: security_guard
num_bytes: 3882558.0
num_examples: 100
- name: interviewer
num_bytes: 3103601.0
num_examples: 100
- name: dispatcher
num_bytes: 3729661.0
num_examples: 100
- name: lawyer
num_bytes: 3105483.0
num_examples: 100
- name: marketing_manager
num_bytes: 3500502.0
num_examples: 100
- name: customer_service_representative
num_bytes: 3294831.0
num_examples: 100
- name: software_developer
num_bytes: 3445707.0
num_examples: 100
- name: mover
num_bytes: 3762882.0
num_examples: 100
- name: supervisor
num_bytes: 3271366.0
num_examples: 100
- name: paralegal
num_bytes: 3452166.0
num_examples: 100
- name: graphic_designer
num_bytes: 4463452.0
num_examples: 100
- name: dentist
num_bytes: 3195882.0
num_examples: 100
- name: roofer
num_bytes: 4594395.0
num_examples: 100
- name: public_relations_specialist
num_bytes: 3346098.0
num_examples: 100
- name: engineer
num_bytes: 3401592.0
num_examples: 100
- name: occupational_therapist
num_bytes: 3308346.0
num_examples: 100
- name: manicurist
num_bytes: 3493207.0
num_examples: 100
- name: cleaner
num_bytes: 3581148.0
num_examples: 100
- name: facilities_manager
num_bytes: 3693224.0
num_examples: 100
- name: repair_worker
num_bytes: 4433569.0
num_examples: 100
- name: cashier
num_bytes: 4698208.0
num_examples: 100
- name: baker
num_bytes: 3984604.0
num_examples: 100
- name: market_research_analyst
num_bytes: 3972330.0
num_examples: 100
- name: health_technician
num_bytes: 3225689.0
num_examples: 100
- name: veterinarian
num_bytes: 3598065.0
num_examples: 100
- name: underwriter
num_bytes: 3052303.0
num_examples: 100
- name: mechanical_engineer
num_bytes: 5204285.0
num_examples: 100
- name: janitor
num_bytes: 3901667.0
num_examples: 100
- name: pilot
num_bytes: 3748614.0
num_examples: 100
- name: therapist
num_bytes: 3031952.0
num_examples: 100
- name: director
num_bytes: 3248609.0
num_examples: 100
- name: wholesale_buyer
num_bytes: 5076103.0
num_examples: 100
- name: air_conditioning_installer
num_bytes: 4488325.0
num_examples: 100
- name: butcher
num_bytes: 4898530.0
num_examples: 100
- name: machinery_mechanic
num_bytes: 5016939.0
num_examples: 100
- name: event_planner
num_bytes: 3813150.0
num_examples: 100
- name: carpet_installer
num_bytes: 4798926.0
num_examples: 100
- name: musician
num_bytes: 3502127.0
num_examples: 100
- name: civil_engineer
num_bytes: 3787249.0
num_examples: 100
- name: farmer
num_bytes: 4691952.0
num_examples: 100
- name: financial_manager
num_bytes: 3396723.0
num_examples: 100
- name: childcare_worker
num_bytes: 3470828.0
num_examples: 100
- name: clerk
num_bytes: 2903767.0
num_examples: 100
- name: machinist
num_bytes: 5270759.0
num_examples: 100
- name: firefighter
num_bytes: 4434213.0
num_examples: 100
- name: photographer
num_bytes: 3188794.0
num_examples: 100
- name: file_clerk
num_bytes: 4124484.0
num_examples: 100
- name: bus_driver
num_bytes: 4492167.0
num_examples: 100
- name: fast_food_worker
num_bytes: 3669214.0
num_examples: 100
- name: bartender
num_bytes: 5229770.0
num_examples: 100
- name: computer_programmer
num_bytes: 3739287.0
num_examples: 100
- name: pharmacist
num_bytes: 4371308.0
num_examples: 100
- name: nursing_assistant
num_bytes: 2939794.0
num_examples: 100
- name: career_counselor
num_bytes: 3351086.0
num_examples: 100
- name: mental_health_counselor
num_bytes: 3602446.0
num_examples: 100
- name: network_administrator
num_bytes: 4825552.0
num_examples: 100
- name: teacher
num_bytes: 2749312.0
num_examples: 100
- name: dishwasher
num_bytes: 5028185.0
num_examples: 100
- name: teller
num_bytes: 3251253.0
num_examples: 100
- name: teaching_assistant
num_bytes: 3557402.0
num_examples: 100
- name: payroll_clerk
num_bytes: 3845179.0
num_examples: 100
- name: laboratory_technician
num_bytes: 3757958.0
num_examples: 100
- name: social_assistant
num_bytes: 3564678.0
num_examples: 100
- name: radiologic_technician
num_bytes: 3885685.0
num_examples: 100
- name: social_worker
num_bytes: 3242952.0
num_examples: 100
- name: nurse
num_bytes: 2554856.0
num_examples: 100
- name: receptionist
num_bytes: 3445701.0
num_examples: 100
- name: carpenter
num_bytes: 4584283.0
num_examples: 100
- name: correctional_officer
num_bytes: 3829211.0
num_examples: 100
- name: community_manager
num_bytes: 3796040.0
num_examples: 100
- name: massage_therapist
num_bytes: 3187773.0
num_examples: 100
- name: head_cook
num_bytes: 3407926.0
num_examples: 100
- name: plane_mechanic
num_bytes: 4632703.0
num_examples: 100
download_size: 582528766
dataset_size: 558658902.0
---
# Dataset Card for "prof_images_blip__stabilityai-stable-diffusion-2-1-base"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/higokumaru_honkai3 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of higokumaru (Houkai 3rd)
This is the dataset of higokumaru (Houkai 3rd), containing 74 images and their tags.
The core tags of this character are `pink_hair, animal_ears, fox_ears, hair_between_eyes, bangs, long_hair, blue_eyes, hair_ornament, tail, fox_tail, multicolored_hair, fox_girl`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 74 | 104.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/higokumaru_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 74 | 54.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/higokumaru_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 162 | 112.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/higokumaru_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 74 | 91.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/higokumaru_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 162 | 172.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/higokumaru_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/higokumaru_honkai3',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, solo, japanese_clothes, open_mouth, streaked_hair, white_background, :d, detached_sleeves, ponytail, simple_background, black_shorts, full_body, bare_shoulders, rope |
| 1 | 13 |  |  |  |  |  | closed_mouth, 1girl, bare_shoulders, solo, breasts, looking_at_viewer, purple_eyes, smile, white_thighhighs, katana, petals, pink_skirt, full_body, miko, sheath, white_sleeves, dress, holding_sword |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | japanese_clothes | open_mouth | streaked_hair | white_background | :d | detached_sleeves | ponytail | simple_background | black_shorts | full_body | bare_shoulders | rope | closed_mouth | breasts | purple_eyes | smile | white_thighhighs | katana | petals | pink_skirt | miko | sheath | white_sleeves | dress | holding_sword |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------------------|:-------------|:----------------|:-------------------|:-----|:-------------------|:-----------|:--------------------|:---------------|:------------|:-----------------|:-------|:---------------|:----------|:--------------|:--------|:-------------------|:---------|:---------|:-------------|:-------|:---------|:----------------|:--------|:----------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | | | | | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
mriosqu/landing_pages_02_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 64770191.0
num_examples: 89
download_size: 63485786
dataset_size: 64770191.0
---
# Dataset Card for "landing_pages_02_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibivibiv/alpaca_tiny10 | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 459989497
num_examples: 290901
download_size: 266195720
dataset_size: 459989497
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
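The columns above follow the familiar Alpaca layout (`instruction`, optional `input`, `output`). As a minimal sketch, one common way to render a record into a training prompt (the template is an assumption, not part of this dataset):
```python
from datasets import load_dataset

ds = load_dataset("ibivibiv/alpaca_tiny10", split="train")

def to_prompt(example: dict) -> str:
    # Conventional Alpaca-style template (an assumption; adapt as needed).
    if example["input"]:
        return (
            f"### Instruction:\n{example['instruction']}\n\n"
            f"### Input:\n{example['input']}\n\n"
            f"### Response:\n{example['output']}"
        )
    return (
        f"### Instruction:\n{example['instruction']}\n\n"
        f"### Response:\n{example['output']}"
    )

print(to_prompt(ds[0])[:300])
```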
|
freshpearYoon/vr_train_free_60 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 6018593270
num_examples: 10000
download_size: 1042461209
dataset_size: 6018593270
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
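Note that `audio` above is a plain struct (raw `array`, `path`, `sampling_rate`) rather than the decoded `Audio` feature type, so the waveform comes back as a list of floats. A minimal sketch, assuming NumPy:
```python
import numpy as np
from datasets import load_dataset

# Stream to avoid downloading the full ~6 GB split up front.
ds = load_dataset("freshpearYoon/vr_train_free_60", split="train", streaming=True)

sample = next(iter(ds))
waveform = np.asarray(sample["audio"]["array"], dtype=np.float64)
sr = sample["audio"]["sampling_rate"]
print(waveform.shape, sr, sample["text"][:80])
```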
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_154 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1167068120.0
num_examples: 227410
download_size: 1194180162
dataset_size: 1167068120.0
---
# Dataset Card for "chunk_154"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lora54/Audio_Editing | ---
license: mit
---
|
ambrosfitz/mcq_data_1 | ---
license: cc-by-3.0
---
|
theblackcat102/multiround-programming-convo | ---
task_categories:
- text-generation
language:
- en
tags:
- data-science
- programming
- statistic
pretty_name: Multi-Round Programming Conversations
size_categories:
- 100K<n<1M
---
# Multi-Round Programming Conversations
Based on the previous evol-codealpaca-v1 dataset, with added questions sampled from Stack Overflow and Cross Validated, extended into multi-round conversations!
It should be better suited to training a code assistant that works side by side with you.
## Tasks included here:
* Data science, statistics, and programming questions
* Code translation: translate a short function between Python, Golang, C++, Java, and JavaScript
* Code fixing: fix code with randomly corrupted characters and missing tab spacing
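Since the card does not publish a column schema, a quick way to inspect one before training (a minimal sketch using the standard `datasets` API):
```python
from datasets import load_dataset

dsd = load_dataset("theblackcat102/multiround-programming-convo")

# Inspect available splits, the column schema, and a first record.
print(dsd)
first_split = next(iter(dsd.values()))
print(first_split.features)
print(first_split[0])
```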
|
CyberHarem/ling_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ling/リィン/令 (Arknights)
This is the dataset of ling/リィン/令 (Arknights), containing 500 images and their tags.
The core tags of this character are `blue_hair, long_hair, horns, dragon_horns, pointy_ears, very_long_hair, breasts, blue_eyes, earrings, dragon_girl, braid, multicolored_hair, large_breasts, tail, dragon_tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ling_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 961.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ling_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1336 | 1.76 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ling_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/ling_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, black_shorts, long_sleeves, looking_at_viewer, open_coat, solo, white_coat, white_shirt, wide_sleeves, yellow_necktie, black_gloves, cowboy_shot, jewelry, parted_lips, smile, elbow_gloves, gourd, short_shorts, holding_cup, thigh_strap, navel, detached_collar |
| 1 | 16 |  |  |  |  |  | 1girl, black_shorts, jewelry, open_coat, short_shorts, solo, white_coat, white_footwear, white_shirt, yellow_necktie, boots, looking_at_viewer, full_body, lantern, long_sleeves, smile, wide_sleeves, thigh_strap, black_gloves, elbow_gloves, gourd, closed_mouth, holding_staff, simple_background |
| 2 | 6 |  |  |  |  |  | 1girl, black_shorts, cowboy_shot, detached_collar, holding, jewelry, long_sleeves, looking_at_viewer, short_shorts, smile, solo, bandeau, bare_shoulders, gloves, navel, off_shoulder, purple_eyes, thigh_strap, black_coat, medium_breasts, open_coat, staff, thighs, belt, gourd, parted_lips, ponytail, tube_top |
| 3 | 26 |  |  |  |  |  | 1girl, jewelry, solo, looking_at_viewer, white_dress, holding, official_alternate_costume, smile, bare_shoulders, wide_sleeves, detached_sleeves, streaked_hair, gloves, blue_skin, long_sleeves, sash, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_shorts | long_sleeves | looking_at_viewer | open_coat | solo | white_coat | white_shirt | wide_sleeves | yellow_necktie | black_gloves | cowboy_shot | jewelry | parted_lips | smile | elbow_gloves | gourd | short_shorts | holding_cup | thigh_strap | navel | detached_collar | white_footwear | boots | full_body | lantern | closed_mouth | holding_staff | simple_background | holding | bandeau | bare_shoulders | gloves | off_shoulder | purple_eyes | black_coat | medium_breasts | staff | thighs | belt | ponytail | tube_top | white_dress | official_alternate_costume | detached_sleeves | streaked_hair | blue_skin | sash | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:---------------|:--------------------|:------------|:-------|:-------------|:--------------|:---------------|:-----------------|:---------------|:--------------|:----------|:--------------|:--------|:---------------|:--------|:---------------|:--------------|:--------------|:--------|:------------------|:-----------------|:--------|:------------|:----------|:---------------|:----------------|:--------------------|:----------|:----------|:-----------------|:---------|:---------------|:--------------|:-------------|:-----------------|:--------|:---------|:-------|:-----------|:-----------|:--------------|:-----------------------------|:-------------------|:----------------|:------------|:-------|:-------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | | X | X | X | X | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | X | | | | | | X | X | X | X | | X | X | | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 3 | 26 |  |  |  |  |  | X | | X | X | | X | | | X | | | | X | | X | | | | | | | | | | | | | | | X | | X | X | | | | | | | | | | X | X | X | X | X | X | X |
|
fun1021183/cvt2_GS3_3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 168383363.336
num_examples: 1258
- name: test
num_bytes: 303296888.792
num_examples: 2222
download_size: 471343711
dataset_size: 471680252.128
---
# Dataset Card for "cvt2_GS3_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
allenai/layout_distribution_shift | ---
license: apache-2.0
dataset_info:
features:
- name: words
sequence: string
- name: bbox
sequence:
sequence: float64
- name: labels
sequence: int64
- name: block_ids
sequence: int64
- name: line_ids
sequence: int64
- name: files
dtype: string
splits:
- name: remapped_Acta_dev.json
num_bytes: 9101699
num_examples: 491
- name: remapped_Acta_fewshot_finetune_10_pubs_dev_episode_0.json
num_bytes: 27958
num_examples: 2
- name: remapped_Acta_fewshot_finetune_10_pubs_dev_episode_1.json
num_bytes: 18241
num_examples: 2
- name: remapped_Acta_fewshot_finetune_10_pubs_dev_episode_2.json
num_bytes: 45036
num_examples: 2
- name: remapped_Acta_fewshot_finetune_10_pubs_train_episode_0.json
num_bytes: 2269140
num_examples: 117
- name: remapped_Acta_fewshot_finetune_10_pubs_train_episode_1.json
num_bytes: 2011417
num_examples: 102
- name: remapped_Acta_fewshot_finetune_10_pubs_train_episode_2.json
num_bytes: 2236354
num_examples: 116
- name: remapped_Acta_test.json
num_bytes: 9450719
num_examples: 495
- name: remapped_Acta_train.json
num_bytes: 71764609
num_examples: 3848
- name: remapped_BMC_dev.json
num_bytes: 23369323
num_examples: 503
- name: remapped_BMC_fewshot_finetune_10_pubs_dev_episode_0.json
num_bytes: 108560
num_examples: 2
- name: remapped_BMC_fewshot_finetune_10_pubs_dev_episode_1.json
num_bytes: 67630
num_examples: 2
- name: remapped_BMC_fewshot_finetune_10_pubs_dev_episode_2.json
num_bytes: 74671
num_examples: 2
- name: remapped_BMC_fewshot_finetune_10_pubs_train_episode_0.json
num_bytes: 3696565
num_examples: 82
- name: remapped_BMC_fewshot_finetune_10_pubs_train_episode_1.json
num_bytes: 3831159
num_examples: 77
- name: remapped_BMC_fewshot_finetune_10_pubs_train_episode_2.json
num_bytes: 4578916
num_examples: 96
- name: remapped_BMC_test.json
num_bytes: 25850198
num_examples: 535
- name: remapped_BMC_train.json
num_bytes: 216531051
num_examples: 4628
- name: remapped_PLoS_dev.json
num_bytes: 78334040
num_examples: 1499
- name: remapped_PLoS_fewshot_finetune_10_pubs_dev_episode_0.json
num_bytes: 93335
num_examples: 2
- name: remapped_PLoS_fewshot_finetune_10_pubs_dev_episode_1.json
num_bytes: 125366
num_examples: 2
- name: remapped_PLoS_fewshot_finetune_10_pubs_dev_episode_2.json
num_bytes: 126234
num_examples: 2
- name: remapped_PLoS_fewshot_finetune_10_pubs_train_episode_0.json
num_bytes: 6190119
num_examples: 120
- name: remapped_PLoS_fewshot_finetune_10_pubs_train_episode_1.json
num_bytes: 5238068
num_examples: 98
- name: remapped_PLoS_fewshot_finetune_10_pubs_train_episode_2.json
num_bytes: 5662127
num_examples: 121
- name: remapped_PLoS_test.json
num_bytes: 77843621
num_examples: 1480
- name: remapped_PLoS_train.json
num_bytes: 622303242
num_examples: 11937
- name: remapped_RU_dev.json
num_bytes: 37618273
num_examples: 689
- name: remapped_RU_fewshot_finetune_10_pubs_dev_episode_0.json
num_bytes: 140245
num_examples: 2
- name: remapped_RU_fewshot_finetune_10_pubs_dev_episode_1.json
num_bytes: 135845
num_examples: 2
- name: remapped_RU_fewshot_finetune_10_pubs_dev_episode_2.json
num_bytes: 153598
num_examples: 2
- name: remapped_RU_fewshot_finetune_10_pubs_train_episode_0.json
num_bytes: 6575257
num_examples: 116
- name: remapped_RU_fewshot_finetune_10_pubs_train_episode_1.json
num_bytes: 5998010
num_examples: 105
- name: remapped_RU_fewshot_finetune_10_pubs_train_episode_2.json
num_bytes: 5014176
num_examples: 99
- name: remapped_RU_test.json
num_bytes: 36500742
num_examples: 665
- name: remapped_RU_train.json
num_bytes: 297906664
num_examples: 5452
- name: remapped_diverse_publications_125_publishers_dev.json
num_bytes: 26129574
num_examples: 493
- name: remapped_diverse_publications_125_publishers_train.json
num_bytes: 628804969
num_examples: 13002
- name: remapped_diverse_publications_25_publishers_dev.json
num_bytes: 30070714
num_examples: 606
- name: remapped_diverse_publications_25_publishers_train.json
num_bytes: 675457461
num_examples: 13538
download_size: 442657892
dataset_size: 2921454926
---
|
LDJnr/LessWrong-Amplify-Instruct | ---
license: apache-2.0
task_categories:
- conversational
- question-answering
- text-generation
language:
- en
tags:
- Physics
- Biology
- Math
- Chemistry
- Culture
- Logic
pretty_name: LessWrong-Amplify-Instruct
size_categories:
- n<1K
---
## This is the Official LessWrong-Amplify-Instruct dataset. Over 500 multi-turn examples, and many more coming soon!
- This leverages the Amplify-Instruct method to extend thousands of scraped LessWrong posts into advanced, in-depth multi-turn conversations.
- Comprising over 500 highly filtered multi-turn conversations between GPT-4 and real humans.
- Average context length per conversation is over 2,000 tokens. (We will measure this more accurately soon.)
- Synthetically created using a newly developed pipeline that leverages GPT-4 to dynamically role-play and inquire as the human and assistant.
- Each conversation is optimized to amplify the raw knowledge retrieval of the model and delve deep into obscure and advanced topics.
## Purpose?
- This dataset is not intended to be trained on by itself; however, its size and quality can work wonderfully as a supplementary addition to virtually any multi-turn-compatible dataset. I encourage this use; all I ask is that proper credit is given!
## Quality filtering and cleaning.
- Extensive cleaning was done to filter out instances of overt AI moralizing or related behaviour, such as "As an AI language model" and "September 2021".
## Credits
During the curation process, there can be some relatively arduous steps when it comes to executing on the best experiments or concepts for filtering examples out.
Luckily, folks over at NousResearch helped expedite this process with little to no sacrifice in quality; a big thank you to J-Supha specifically for making these significant contributions.
## Future Plans & How you can help!
This is a relatively early build among grand plans for what I intend to work on next!
In the near future, we plan on leveraging the help of domain-specific expert volunteers to eliminate any mathematically or verifiably incorrect answers from training curations across different types of datasets.
If you have at least a bachelor's degree in mathematics, physics, biology, or chemistry and would like to volunteer even just 30 minutes of your expertise, please contact LDJ on Discord!
Citation:
```
@article{daniele2023amplify-instruct,
title={Amplify-Instruct: Synthetically Generated Diverse Multi-turn Conversations for Efficient LLM Training.},
author={Daniele, Luigi and Suphavadeeprasit},
journal={arXiv preprint arXiv:(coming soon)},
year={2023}
}
``` |
shidowake/FreedomIntelligence_alpaca-gpt4-japanese_subset_split_7 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 4863217.322740098
num_examples: 4997
download_size: 2456445
dataset_size: 4863217.322740098
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kovakavics/comfyuicuccaim | ---
license: afl-3.0
---
|
tariktalhadinc/testdataset | ---
license: openrail
---
|
igorktech/anekdots | ---
language:
- ru
license: odc-by
size_categories:
- 100K<n<1M
task_categories:
- text-generation
pretty_name: Anekdots
tags:
- not-for-all-audiences
- roleplay
dataset_info:
features:
- name: total_mark
dtype: int64
- name: date
dtype: int64
- name: downvote
dtype: int64
- name: total_votes
dtype: string
- name: upvote
dtype: int64
- name: text
dtype: string
- name: hash
dtype: string
- name: alpha_frac
dtype: float64
- name: LDR
dtype: float64
- name: days_since_publication
dtype: int64
- name: time_decay
dtype: float64
- name: LDR_time_decay
dtype: float64
splits:
- name: train
num_bytes: 209320893
num_examples: 497596
download_size: 121676024
dataset_size: 209320893
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
---
# Anekdots Dataset Summary
This dataset comprises a collection of humorous Russian anecdotes ("anekdots") gathered over the period from January 4, 1996, to December 4, 2023. The dataset has undergone a thorough cleaning and preparation process to ensure its suitability for model training. Researchers and developers can leverage this curated dataset for various applications, such as natural language processing and machine learning.
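Given the vote fields in the schema above (`total_mark`, `upvote`, `downvote`), a minimal sketch for surfacing the highest-rated entries:
```python
from datasets import load_dataset

ds = load_dataset("igorktech/anekdots", split="train")

# Rank by total_mark (see the dataset_info schema above) and show the top five.
top = ds.sort("total_mark", reverse=True).select(range(5))
for row in top:
    print(row["total_mark"], row["text"][:120].replace("\n", " "))
```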
---
# Dataset License Summary
This dataset is released under the Open Data Commons Attribution License (ODC-BY). The licensor does not claim copyright on the content and encourages wide use and distribution.
## Disclaimer
The dataset's author explicitly disclaims any rights to the content and assumes no responsibility for its usage. The dataset may contain materials from [anekdot.ru](https://www.anekdot.ru/), and users are encouraged to refer to the website for additional context.
## Warning
The administration of [anekdot.ru](https://www.anekdot.ru/) disclaims responsibility for submitted content, potential legal violations, or offensive nature. Rights to published materials belong to their respective owners, and the website administration is not liable for third-party use. The administration reserves the right to use information at its discretion and may remove user-submitted materials.
## Dataset Author Disclaimer
The dataset's author explicitly states no claim to content rights and is not responsible for its accuracy, legality, or appropriateness. Users are advised to exercise discretion and judgment when utilizing the dataset.
---
### Citation
```
@MISC{igorktech/anekdots,
author = {Igor Kuzmin},
title = {Russian anecdotes dump for 30 years},
url = {https://huggingface.co/datasets/igorktech/anekdots},
year = 2023
}
``` |
Steven0633/image50 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': arrange chairs
'1': arrange flowers
'2': bake potato
'3': beat eggs
'4': bend knee
'5': bend tree
'6': bind hair
'7': bite apple
'8': block door
'9': block window
'10': boil egg
'11': boil potato
'12': break bowl
'13': break cup
'14': break door
'15': break egg
'16': break glass
'17': break window
'18': burn book
'19': burn paper
'20': burn tree
'21': burn wood
'22': burst balloon
'23': burst door
'24': carry bag
'25': carry book
'26': carry umbrella
'27': chop carrot
'28': chop meat
'29': chop onion
'30': chop tree
'31': chop wood
'32': close book
'33': close cabinet
'34': close door
'35': close drawer
'36': close window
'37': coil rope
'38': cook egg
'39': cook meat
'40': cook onion
'41': cook potato
'42': crack bottle
'43': crack egg
'44': crack glass
'45': crack window
'46': crash car
'47': crop hair
'48': cut apple
'49': cut meat
'50': cut onion
'51': cut potato
'52': cut tree
'53': cut wood
'54': fasten door
'55': fasten window
'56': fold paper
'57': fry egg
'58': fry meat
'59': fry potato
'60': grate carrot
'61': grate potato
'62': grind meat
'63': hang bag
'64': hang shirt
'65': ignite paper
'66': ignite wood
'67': insert key
'68': kick door
'69': kick football
'70': knot rope
'71': label bottle
'72': label box
'73': lock cabinet
'74': lock door
'75': lock drawer
'76': lock window
'77': mash potato
'78': mix eggs
'79': open bottle
'80': open box
'81': open cabinet
'82': open door
'83': open drawer
'84': open umbrella
'85': open window
'86': park car
'87': peel apple
'88': peel banana
'89': peel carrot
'90': peel orange
'91': peel potato
'92': pile books
'93': pile boxes
'94': pile wood
'95': pitch baseball
'96': ride bicycle
'97': rip paper
'98': roll paper
'99': roll umbrella
'100': saw tree
'101': saw wood
'102': scratch car
'103': scratch knee
'104': shave hair
'105': shut door
'106': shut window
'107': skin knee
'108': slice apple
'109': slice meat
'110': slice onion
'111': slice potato
'112': smash door
'113': smash window
'114': soak hair
'115': soak shirt
'116': spill coffee
'117': split tree
'118': split wood
'119': squeeze bottle
'120': squeeze orange
'121': stain paper
'122': stain shirt
'123': stir coffee
'124': stir soup
'125': strip tree
'126': tear book
'127': tear paper
'128': tear shirt
'129': throw apple
'130': throw baseball
'131': throw football
'132': throw frisbee
'133': tie shoe
'134': trim hair
'135': trim tree
'136': twist hair
'137': twist rope
'138': wrap book
'139': wrap box
splits:
- name: train
num_bytes: 191648684.53815603
num_examples: 6126
- name: test
num_bytes: 20857643.465843983
num_examples: 681
download_size: 213918792
dataset_size: 212506328.004
---
# Dataset Card for "image50"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cmrc2018 | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- zh
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: cmrc-2018
pretty_name: Chinese Machine Reading Comprehension 2018
dataset_info:
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 15508110
num_examples: 10142
- name: validation
num_bytes: 5183809
num_examples: 3219
- name: test
num_bytes: 1606931
num_examples: 1002
download_size: 11508117
dataset_size: 22298850
---
# Dataset Card for "cmrc2018"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/ymcui/cmrc2018](https://github.com/ymcui/cmrc2018)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 11.50 MB
- **Size of the generated dataset:** 22.31 MB
- **Total amount of disk used:** 33.83 MB
### Dataset Summary
A span-extraction dataset for Chinese machine reading comprehension, created to add linguistic diversity to this area. The dataset is composed of nearly 20,000 real questions annotated on Wikipedia paragraphs by human experts. We also annotated a challenge set containing questions that require comprehensive understanding and multi-sentence inference throughout the context.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 11.50 MB
- **Size of the generated dataset:** 22.31 MB
- **Total amount of disk used:** 33.83 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"answers": {
"answer_start": [11, 11],
"text": ["光荣和ω-force", "光荣和ω-force"]
},
"context": "\"《战国无双3》()是由光荣和ω-force开发的战国无双系列的正统第三续作。本作以三大故事为主轴,分别是以武田信玄等人为主的《关东三国志》,织田信长等人为主的《战国三杰》,石田三成等人为主的《关原的年轻武者》,丰富游戏内的剧情。此部份专门介绍角色,欲知武...",
"id": "DEV_0_QUERY_0",
"question": "《战国无双3》是由哪两个公司合作开发的?"
}
```
### Data Fields
The data fields are the same among all splits.
#### default
- `id`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: an `int32` feature.
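Because answers are spans into `context`, `answer_start` can be verified directly against the text; a minimal sketch:
```python
from datasets import load_dataset

ds = load_dataset("cmrc2018", split="validation")

sample = ds[0]
answer = sample["answers"]["text"][0]
start = sample["answers"]["answer_start"][0]

# Gold answers are exact substrings of the context at answer_start.
assert sample["context"][start:start + len(answer)] == answer
print(sample["question"], "->", answer)
```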
### Data Splits
| name | train | validation | test |
| ------- | ----: | ---------: | ---: |
| default | 10142 | 3219 | 1002 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{cui-emnlp2019-cmrc2018,
title = "A Span-Extraction Dataset for {C}hinese Machine Reading Comprehension",
author = "Cui, Yiming and
Liu, Ting and
Che, Wanxiang and
Xiao, Li and
Chen, Zhipeng and
Ma, Wentao and
Wang, Shijin and
Hu, Guoping",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)",
month = nov,
year = "2019",
address = "Hong Kong, China",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/D19-1600",
doi = "10.18653/v1/D19-1600",
pages = "5886--5891",
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten), [@mariamabarham](https://github.com/mariamabarham), [@lewtun](https://github.com/lewtun), [@thomwolf](https://github.com/thomwolf) for adding this dataset. |
open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-300step-flan-v2 | ---
pretty_name: Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-300step-flan-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Korabbit/Llama-2-7b-chat-hf-afr-300step-flan-v2](https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-300step-flan-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-300step-flan-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-06T16:40:21.068162](https://huggingface.co/datasets/open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-300step-flan-v2/blob/main/results_2023-12-06T16-40-21.068162.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4858318036904494,\n\
\ \"acc_stderr\": 0.03428773546743271,\n \"acc_norm\": 0.4907011751374352,\n\
\ \"acc_norm_stderr\": 0.03504506485866877,\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768545,\n \"mc2\": 0.45138129313940284,\n\
\ \"mc2_stderr\": 0.015562220951147801\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49146757679180886,\n \"acc_stderr\": 0.014609263165632191,\n\
\ \"acc_norm\": 0.5255972696245734,\n \"acc_norm_stderr\": 0.014592230885298964\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5911173073093009,\n\
\ \"acc_stderr\": 0.004906227902850758,\n \"acc_norm\": 0.7776339374626569,\n\
\ \"acc_norm_stderr\": 0.004149859300604911\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.539622641509434,\n \"acc_stderr\": 0.030676096599389184,\n\
\ \"acc_norm\": 0.539622641509434,\n \"acc_norm_stderr\": 0.030676096599389184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4046242774566474,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.4046242774566474,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29365079365079366,\n \"acc_stderr\": 0.02345603738398203,\n \"\
acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.02345603738398203\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.038095238095238126,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.038095238095238126\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5258064516129032,\n\
\ \"acc_stderr\": 0.02840609505765332,\n \"acc_norm\": 0.5258064516129032,\n\
\ \"acc_norm_stderr\": 0.02840609505765332\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998573,\n\
\ \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998573\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.0381549430868893,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.0381549430868893\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.601010101010101,\n \"acc_stderr\": 0.03488901616852732,\n \"acc_norm\"\
: 0.601010101010101,\n \"acc_norm_stderr\": 0.03488901616852732\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.03221024508041153,\n\
\ \"acc_norm\": 0.7253886010362695,\n \"acc_norm_stderr\": 0.03221024508041153\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4282051282051282,\n \"acc_stderr\": 0.025088301454694834,\n\
\ \"acc_norm\": 0.4282051282051282,\n \"acc_norm_stderr\": 0.025088301454694834\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844082,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.03210479051015776,\n\
\ \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.03210479051015776\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.671559633027523,\n \"acc_stderr\": 0.02013590279729841,\n \"acc_norm\"\
: 0.671559633027523,\n \"acc_norm_stderr\": 0.02013590279729841\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3287037037037037,\n\
\ \"acc_stderr\": 0.032036140846700596,\n \"acc_norm\": 0.3287037037037037,\n\
\ \"acc_norm_stderr\": 0.032036140846700596\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.033086111132364336,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033086111132364336\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6624472573839663,\n \"acc_stderr\": 0.030781549102026226,\n \
\ \"acc_norm\": 0.6624472573839663,\n \"acc_norm_stderr\": 0.030781549102026226\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\"\
: 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978813,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978813\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.03895632464138937,\n\
\ \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.03895632464138937\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.02934311479809446,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.02934311479809446\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6768837803320562,\n\
\ \"acc_stderr\": 0.016723726512343048,\n \"acc_norm\": 0.6768837803320562,\n\
\ \"acc_norm_stderr\": 0.016723726512343048\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.026907849856282542,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.026907849856282542\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22681564245810057,\n\
\ \"acc_stderr\": 0.014005843570897899,\n \"acc_norm\": 0.22681564245810057,\n\
\ \"acc_norm_stderr\": 0.014005843570897899\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.028624412550167958,\n\
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.028624412550167958\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.572347266881029,\n\
\ \"acc_stderr\": 0.028099240775809553,\n \"acc_norm\": 0.572347266881029,\n\
\ \"acc_norm_stderr\": 0.028099240775809553\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5679012345679012,\n \"acc_stderr\": 0.02756301097160668,\n\
\ \"acc_norm\": 0.5679012345679012,\n \"acc_norm_stderr\": 0.02756301097160668\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.02878222756134724,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.02878222756134724\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3500651890482399,\n\
\ \"acc_stderr\": 0.012182552313215172,\n \"acc_norm\": 0.3500651890482399,\n\
\ \"acc_norm_stderr\": 0.012182552313215172\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4934640522875817,\n \"acc_stderr\": 0.020226106567657807,\n \
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.020226106567657807\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5102040816326531,\n \"acc_stderr\": 0.03200255347893782,\n\
\ \"acc_norm\": 0.5102040816326531,\n \"acc_norm_stderr\": 0.03200255347893782\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.03446296217088427,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.03446296217088427\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768545,\n \"mc2\": 0.45138129313940284,\n\
\ \"mc2_stderr\": 0.015562220951147801\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7253354380426204,\n \"acc_stderr\": 0.012544516005117187\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17968157695223655,\n \
\ \"acc_stderr\": 0.01057511996424224\n }\n}\n```"
repo_url: https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-300step-flan-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|arc:challenge|25_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|gsm8k|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hellaswag|10_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T16-40-21.068162.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-06T16-40-21.068162.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- '**/details_harness|winogrande|5_2023-12-06T16-40-21.068162.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-06T16-40-21.068162.parquet'
- config_name: results
data_files:
- split: 2023_12_06T16_40_21.068162
path:
- results_2023-12-06T16-40-21.068162.parquet
- split: latest
path:
- results_2023-12-06T16-40-21.068162.parquet
---
# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-300step-flan-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-300step-flan-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Korabbit/Llama-2-7b-chat-hf-afr-300step-flan-v2](https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-300step-flan-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-300step-flan-v2",
"harness_winogrande_5",
split="train")
```
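The aggregated metrics can be loaded the same way from the "results" configuration (a minimal sketch; the configuration and split names come from the `configs` list above):
```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split always
# points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-300step-flan-v2",
    "results",
    split="latest",
)
print(results[0])
```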
## Latest results
These are the [latest results from run 2023-12-06T16:40:21.068162](https://huggingface.co/datasets/open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-300step-flan-v2/blob/main/results_2023-12-06T16-40-21.068162.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4858318036904494,
"acc_stderr": 0.03428773546743271,
"acc_norm": 0.4907011751374352,
"acc_norm_stderr": 0.03504506485866877,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768545,
"mc2": 0.45138129313940284,
"mc2_stderr": 0.015562220951147801
},
"harness|arc:challenge|25": {
"acc": 0.49146757679180886,
"acc_stderr": 0.014609263165632191,
"acc_norm": 0.5255972696245734,
"acc_norm_stderr": 0.014592230885298964
},
"harness|hellaswag|10": {
"acc": 0.5911173073093009,
"acc_stderr": 0.004906227902850758,
"acc_norm": 0.7776339374626569,
"acc_norm_stderr": 0.004149859300604911
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.539622641509434,
"acc_stderr": 0.030676096599389184,
"acc_norm": 0.539622641509434,
"acc_norm_stderr": 0.030676096599389184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.02345603738398203,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.02345603738398203
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.038095238095238126,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.038095238095238126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5258064516129032,
"acc_stderr": 0.02840609505765332,
"acc_norm": 0.5258064516129032,
"acc_norm_stderr": 0.02840609505765332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998573,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998573
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.0381549430868893,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.0381549430868893
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.601010101010101,
"acc_stderr": 0.03488901616852732,
"acc_norm": 0.601010101010101,
"acc_norm_stderr": 0.03488901616852732
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7253886010362695,
"acc_stderr": 0.03221024508041153,
"acc_norm": 0.7253886010362695,
"acc_norm_stderr": 0.03221024508041153
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4282051282051282,
"acc_stderr": 0.025088301454694834,
"acc_norm": 0.4282051282051282,
"acc_norm_stderr": 0.025088301454694834
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844082,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.03210479051015776,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.03210479051015776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.671559633027523,
"acc_stderr": 0.02013590279729841,
"acc_norm": 0.671559633027523,
"acc_norm_stderr": 0.02013590279729841
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.032036140846700596,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.032036140846700596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033086111132364336,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033086111132364336
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6624472573839663,
"acc_stderr": 0.030781549102026226,
"acc_norm": 0.6624472573839663,
"acc_norm_stderr": 0.030781549102026226
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978813,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978813
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5644171779141104,
"acc_stderr": 0.03895632464138937,
"acc_norm": 0.5644171779141104,
"acc_norm_stderr": 0.03895632464138937
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02934311479809446,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02934311479809446
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6768837803320562,
"acc_stderr": 0.016723726512343048,
"acc_norm": 0.6768837803320562,
"acc_norm_stderr": 0.016723726512343048
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.026907849856282542,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.026907849856282542
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22681564245810057,
"acc_stderr": 0.014005843570897899,
"acc_norm": 0.22681564245810057,
"acc_norm_stderr": 0.014005843570897899
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.028624412550167958,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.028624412550167958
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.572347266881029,
"acc_stderr": 0.028099240775809553,
"acc_norm": 0.572347266881029,
"acc_norm_stderr": 0.028099240775809553
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5679012345679012,
"acc_stderr": 0.02756301097160668,
"acc_norm": 0.5679012345679012,
"acc_norm_stderr": 0.02756301097160668
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.02878222756134724,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.02878222756134724
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3500651890482399,
"acc_stderr": 0.012182552313215172,
"acc_norm": 0.3500651890482399,
"acc_norm_stderr": 0.012182552313215172
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.020226106567657807,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.020226106567657807
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5102040816326531,
"acc_stderr": 0.03200255347893782,
"acc_norm": 0.5102040816326531,
"acc_norm_stderr": 0.03200255347893782
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.03446296217088427,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.03446296217088427
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768545,
"mc2": 0.45138129313940284,
"mc2_stderr": 0.015562220951147801
},
"harness|winogrande|5": {
"acc": 0.7253354380426204,
"acc_stderr": 0.012544516005117187
},
"harness|gsm8k|5": {
"acc": 0.17968157695223655,
"acc_stderr": 0.01057511996424224
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sinkcup/openspd | ---
license: cc-by-4.0
configs:
- config_name: "电视"
info: "家电 > 电视"
data_files:
- split: train
path:
- "电视/train/0000.csv"
sep: ","
- config_name: "汽车"
info: "交通工具 > 汽车"
data_files:
- split: train
path:
- "汽车/train/0000.csv"
sep: ","
---
|
polytechXhf/onepiece-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: char_name
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 120488910.0
num_examples: 922
download_size: 120447392
dataset_size: 120488910.0
---
# Dataset Card for "onepiece-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_118 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1428346736
num_examples: 280508
download_size: 1458298434
dataset_size: 1428346736
---
# Dataset Card for "chunk_118"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/OxfordPets_test_embeddings | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: int64
- name: vision_embeddings
sequence: float32
splits:
- name: openai_clip_vit_large_patch14
num_bytes: 424231302.0
num_examples: 3669
download_size: 426276832
dataset_size: 424231302.0
---
# Dataset Card for "OxfordPets_test_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Matthijs/snacks | ---
pretty_name: Snacks
task_categories:
- image-classification
- computer-vision
license: cc-by-4.0
---
# Dataset Card for Snacks
## Dataset Summary
This is a dataset of 20 different types of snack foods that accompanies the book [Machine Learning by Tutorials](https://www.raywenderlich.com/books/machine-learning-by-tutorials/v2.0).
The images were taken from the [Google Open Images dataset](https://storage.googleapis.com/openimages/web/index.html), release 2017_11.
## Dataset Structure
Number of images in the train/validation/test splits:
```nohighlight
train 4838
val 955
test 952
total 6745
```
Total images in each category:
```nohighlight
apple 350
banana 350
cake 349
candy 349
carrot 349
cookie 349
doughnut 350
grape 350
hot dog 350
ice cream 350
juice 350
muffin 348
orange 349
pineapple 340
popcorn 260
pretzel 204
salad 350
strawberry 348
waffle 350
watermelon 350
```
To save space in the download, the images were resized so that their smallest side is 256 pixels. All EXIF information was removed.
### Data Splits
Train, Test, Validation
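A minimal loading sketch with the `datasets` library (the exact split keys, e.g. `validation` vs. `val`, are an assumption here, not taken from the card):
```python
from datasets import load_dataset

# Assumed split names; the card lists train / val / test image counts above.
snacks = load_dataset("Matthijs/snacks")
example = snacks["train"][0]
print(example)  # an image plus its label, one of the 20 snack categories
```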
## Licensing Information
Just like the images from Google Open Images, the snacks dataset is licensed under the terms of the Creative Commons license.
The images are listed as having a [CC BY 2.0](https://creativecommons.org/licenses/by/2.0/) license.
The annotations are licensed by Google Inc. under a [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/) license.
The **credits.csv** file contains the original URL, author information and license for each image.
|
sourcerror/ply | ---
license: mit
---
|
eb/num100000test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 133596122.4
num_examples: 90000
- name: test
num_bytes: 14844013.6
num_examples: 10000
download_size: 84945913
dataset_size: 148440136.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
tiansz/ChineseSTS | ---
license: apache-2.0
task_categories:
- sentence-similarity
language:
- zh
tags:
- STS
size_categories:
- 1M<n<10M
---
This is a Chinese sentence-similarity dataset; similarity is labeled as 0 or 1.
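A minimal loading sketch (the split name and record layout below are assumptions, not taken from this card):
```python
from datasets import load_dataset

# Assumed split name; each record should hold two sentences
# plus a 0/1 similarity label.
ds = load_dataset("tiansz/ChineseSTS", split="train")
print(ds[0])
```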
This [notebook](https://www.kaggle.com/code/tiansztianszs/chinese-sentence-similarity) documents my full workflow with this dataset. You can also download the dataset on [github](https://github.com/tiansztiansz/Chinese-Text-Similarity) |
ericyu3/openassistant_inpainted_dialogs_5k_biomedical | ---
license: apache-2.0
size_categories:
- 1K<n<10K
---
This dataset was created by:
* Starting with the [Dialog Inpainting](https://github.com/google-research/dialog-inpainting) dataset
* Labeling the turns of each dialog with `User: ` and `Assistant: `
* Filtering with spaCy, using code similar to the following (written by https://huggingface.co/ontocord):
```python
import pandas as pd
import spacy

# Load the SciSpacy CRAFT NER model once, skipping the load if `sci` already exists.
try:
    sci
except NameError:
    sci = spacy.load("en_ner_craft_md")

data = pd.read_parquet('data.parquet', engine='pyarrow')
for a in data['labeleddialog']:
    a = a.replace("this article", "this subject").replace("()", "").replace("  ", " ")
    # Drop dialogs about the arts.
    if 'novel' in a or ' story' in a or 'movie' in a or 'film' in a or 'music' in a:
        continue
    # Drop dialogs about sports.
    if ' game' in a or 'sports' in a or 'football' in a or 'soccer' in a or 'baseball' in a or 'basketball' in a:
        continue
    # Drop dialogs about places.
    if 'population' in a or 'territory' in a or 'village' in a or 'country' in a or 'county' in a:
        continue
    # Drop dialogs about food and recipes.
    if 'ingredient' in a or 'food' in a or 'recipe' in a:
        continue
    # Drop dialogs about law.
    if ' rights' in a or ' court ' in a or ' criminal ' in a or ' verdict ' in a or ' guilt ' in a or ' legislat' in a:
        continue
    # Keep dialogs with more than three biomedical entities: sequence-ontology
    # terms, or ChEBI terms longer than 5 characters. `ent.label_` is the
    # string entity label (e.g. 'SO', 'CHEBI').
    doc = sci(a)
    j = 0
    for ent in doc.ents:
        if ent.label_ == 'SO' or (ent.label_ == 'CHEBI' and len(ent.text) > 5):
            j += 1
        if j > 3:
            print('###biomed\n', a)
            break
```
* Filtering with BERT, using the following code:
```python
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

# Classify every dialog's Wikipedia page title; `page_titles` is the list of titles.
classification_results = classifier(page_titles, ["Biomedical", "Non-biomedical"])

# Dialogs with page titles with `prob < 0.7` were dropped.
for classification_result in classification_results:
    prob = classification_result["scores"][classification_result["labels"].index("Biomedical")]
``` |
Tristan/wikipedia-august-october-line-diff-1000-char-threshold | ---
dataset_info:
features:
- name: url
dtype: string
- name: text
dtype: string
- name: crawl_timestamp
dtype: int64
- name: reward
dtype: int64
splits:
- name: train
num_bytes: 403299007
num_examples: 285657
download_size: 161874884
dataset_size: 403299007
---
# Dataset Card for "wikipedia-august-october-line-diff-1000-char-threshold"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2 | ---
pretty_name: Evaluation run of martyn/mixtral-megamerge-dare-8x7b-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [martyn/mixtral-megamerge-dare-8x7b-v2](https://huggingface.co/martyn/mixtral-megamerge-dare-8x7b-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T07:03:35.967501](https://huggingface.co/datasets/open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2/blob/main/results_2024-01-14T07-03-35.967501.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6893459569280364,\n\
\ \"acc_stderr\": 0.030858049040324388,\n \"acc_norm\": 0.6938293567967714,\n\
\ \"acc_norm_stderr\": 0.03145368794832943,\n \"mc1\": 0.397796817625459,\n\
\ \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5381182686685855,\n\
\ \"mc2_stderr\": 0.0153563125426782\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759091,\n\
\ \"acc_norm\": 0.6646757679180887,\n \"acc_norm_stderr\": 0.013796182947785562\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6766580362477594,\n\
\ \"acc_stderr\": 0.004667960519938637,\n \"acc_norm\": 0.8610834495120494,\n\
\ \"acc_norm_stderr\": 0.003451525868724678\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n\
\ \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.6814814814814815,\n\
\ \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7660377358490567,\n \"acc_stderr\": 0.02605529690115292,\n\
\ \"acc_norm\": 0.7660377358490567,\n \"acc_norm_stderr\": 0.02605529690115292\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n\
\ \"acc_stderr\": 0.03476599607516477,\n \"acc_norm\": 0.7052023121387283,\n\
\ \"acc_norm_stderr\": 0.03476599607516477\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.030783736757745657,\n\
\ \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.030783736757745657\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n\
\ \"acc_stderr\": 0.04644602091222317,\n \"acc_norm\": 0.5789473684210527,\n\
\ \"acc_norm_stderr\": 0.04644602091222317\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.039417076320648906,\n\
\ \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.039417076320648906\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47619047619047616,\n \"acc_stderr\": 0.02572209706438853,\n \"\
acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.02572209706438853\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268556,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268556\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5566502463054187,\n \"acc_stderr\": 0.03495334582162933,\n\
\ \"acc_norm\": 0.5566502463054187,\n \"acc_norm_stderr\": 0.03495334582162933\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.02655220782821529,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02655220782821529\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678185,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678185\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857403,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857403\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.029213549414372167,\n\
\ \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.029213549414372167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.040428099613956346,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.040428099613956346\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660836,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660836\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568606,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568606\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8734177215189873,\n \"acc_stderr\": 0.02164419572795517,\n \
\ \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.02164419572795517\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n\
\ \"acc_stderr\": 0.028568079464714284,\n \"acc_norm\": 0.7623318385650224,\n\
\ \"acc_norm_stderr\": 0.028568079464714284\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.01911989279892498,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.01911989279892498\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8825031928480205,\n\
\ \"acc_stderr\": 0.011515102251977185,\n \"acc_norm\": 0.8825031928480205,\n\
\ \"acc_norm_stderr\": 0.011515102251977185\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7774566473988439,\n \"acc_stderr\": 0.02239421566194282,\n\
\ \"acc_norm\": 0.7774566473988439,\n \"acc_norm_stderr\": 0.02239421566194282\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n\
\ \"acc_stderr\": 0.016588680864530622,\n \"acc_norm\": 0.43687150837988825,\n\
\ \"acc_norm_stderr\": 0.016588680864530622\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824785,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824785\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n\
\ \"acc_stderr\": 0.024723861504771696,\n \"acc_norm\": 0.7459807073954984,\n\
\ \"acc_norm_stderr\": 0.024723861504771696\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8179012345679012,\n \"acc_stderr\": 0.02147349183480834,\n\
\ \"acc_norm\": 0.8179012345679012,\n \"acc_norm_stderr\": 0.02147349183480834\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5260756192959583,\n\
\ \"acc_stderr\": 0.012752858346533143,\n \"acc_norm\": 0.5260756192959583,\n\
\ \"acc_norm_stderr\": 0.012752858346533143\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144714,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144714\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7565359477124183,\n \"acc_stderr\": 0.017362473762146627,\n \
\ \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.017362473762146627\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.027212835884073142,\n\
\ \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.027212835884073142\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776348,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776348\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.02464806896136615,\n\
\ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.02464806896136615\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.397796817625459,\n\
\ \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5381182686685855,\n\
\ \"mc2_stderr\": 0.0153563125426782\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.011285013754047443\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5390447308567097,\n \
\ \"acc_stderr\": 0.01373042844911634\n }\n}\n```"
repo_url: https://huggingface.co/martyn/mixtral-megamerge-dare-8x7b-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|arc:challenge|25_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|arc:challenge|25_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|gsm8k|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|gsm8k|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hellaswag|10_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hellaswag|10_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T05-29-42.877367.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T07-03-35.967501.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T07-03-35.967501.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- '**/details_harness|winogrande|5_2023-12-30T05-29-42.877367.parquet'
- split: 2024_01_14T07_03_35.967501
path:
- '**/details_harness|winogrande|5_2024-01-14T07-03-35.967501.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T07-03-35.967501.parquet'
- config_name: results
data_files:
- split: 2023_12_30T05_29_42.877367
path:
- results_2023-12-30T05-29-42.877367.parquet
- split: 2024_01_14T07_03_35.967501
path:
- results_2024-01-14T07-03-35.967501.parquet
- split: latest
path:
- results_2024-01-14T07-03-35.967501.parquet
---
# Dataset Card for Evaluation run of martyn/mixtral-megamerge-dare-8x7b-v2
Dataset automatically created during the evaluation run of model [martyn/mixtral-megamerge-dare-8x7b-v2](https://huggingface.co/martyn/mixtral-megamerge-dare-8x7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2",
"harness_winogrande_5",
split="train")
```
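Each configuration also exposes a `latest` split plus one split per timestamped run (see the `configs` listing in the metadata above), so a specific run can be selected directly. A minimal sketch, assuming the split names shown in this card:
```python
from datasets import load_dataset

# Load the most recent details for one task configuration.
latest = load_dataset(
    "open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)

# Or pin a particular run via its timestamped split name.
run = load_dataset(
    "open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2",
    "harness_hendrycksTest_world_religions_5",
    split="2024_01_14T07_03_35.967501",
)
```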
## Latest results
These are the [latest results from run 2024-01-14T07:03:35.967501](https://huggingface.co/datasets/open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2/blob/main/results_2024-01-14T07-03-35.967501.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6893459569280364,
"acc_stderr": 0.030858049040324388,
"acc_norm": 0.6938293567967714,
"acc_norm_stderr": 0.03145368794832943,
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559635,
"mc2": 0.5381182686685855,
"mc2_stderr": 0.0153563125426782
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759091,
"acc_norm": 0.6646757679180887,
"acc_norm_stderr": 0.013796182947785562
},
"harness|hellaswag|10": {
"acc": 0.6766580362477594,
"acc_stderr": 0.004667960519938637,
"acc_norm": 0.8610834495120494,
"acc_norm_stderr": 0.003451525868724678
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7660377358490567,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.7660377358490567,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.03476599607516477,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.03476599607516477
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.030783736757745657,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.030783736757745657
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04644602091222317,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04644602091222317
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.02572209706438853,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.02572209706438853
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268556,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268556
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5566502463054187,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.5566502463054187,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02655220782821529,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02655220782821529
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678185,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678185
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857403,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857403
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.029213549414372167,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.029213549414372167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.040428099613956346,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.040428099613956346
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660836,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660836
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568606,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568606
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.02164419572795517,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.02164419572795517
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7623318385650224,
"acc_stderr": 0.028568079464714284,
"acc_norm": 0.7623318385650224,
"acc_norm_stderr": 0.028568079464714284
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.01911989279892498,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.01911989279892498
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8825031928480205,
"acc_stderr": 0.011515102251977185,
"acc_norm": 0.8825031928480205,
"acc_norm_stderr": 0.011515102251977185
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7774566473988439,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.7774566473988439,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43687150837988825,
"acc_stderr": 0.016588680864530622,
"acc_norm": 0.43687150837988825,
"acc_norm_stderr": 0.016588680864530622
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824785,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824785
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.024723861504771696,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.024723861504771696
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8179012345679012,
"acc_stderr": 0.02147349183480834,
"acc_norm": 0.8179012345679012,
"acc_norm_stderr": 0.02147349183480834
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5260756192959583,
"acc_stderr": 0.012752858346533143,
"acc_norm": 0.5260756192959583,
"acc_norm_stderr": 0.012752858346533143
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144714,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144714
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.017362473762146627,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.017362473762146627
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.027212835884073142,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.027212835884073142
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776348,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776348
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.02464806896136615,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.02464806896136615
},
"harness|truthfulqa:mc|0": {
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559635,
"mc2": 0.5381182686685855,
"mc2_stderr": 0.0153563125426782
},
"harness|winogrande|5": {
"acc": 0.797947908445146,
"acc_stderr": 0.011285013754047443
},
"harness|gsm8k|5": {
"acc": 0.5390447308567097,
"acc_stderr": 0.01373042844911634
}
}
```
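The aggregated numbers above are also stored as data: they live in the "results" configuration referenced in the metadata. A short sketch of pulling them, assuming the same split conventions as the other configurations:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics used by the leaderboard.
results = load_dataset(
    "open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated metrics, mirroring the JSON above
```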
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yuiseki/scp-jp-plain | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 10202611
num_examples: 999
download_size: 5333180
dataset_size: 10202611
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-sa-3.0
--- |
Hack90/chikungunya | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
splits:
- name: train
num_bytes: 4582345.313267142
num_examples: 1801
download_size: 8874613
dataset_size: 4582345.313267142
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ML4CO/SATLIBOriDataset | ---
license: apache-2.0
---
|
tingchih/KG_perceiver_MLM_100 | ---
dataset_info:
features:
- name: KG
dtype: string
splits:
- name: Train
num_bytes: 3928929
num_examples: 70
- name: Test
num_bytes: 1942130
num_examples: 32
download_size: 1854285
dataset_size: 5871059
---
# Dataset Card for "KG_perceiver_MLM_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_105 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 978200256.0
num_examples: 190608
download_size: 1002249319
dataset_size: 978200256.0
---
# Dataset Card for "chunk_105"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_monology__mixtral-soup | ---
pretty_name: Evaluation run of monology/mixtral-soup
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [monology/mixtral-soup](https://huggingface.co/monology/mixtral-soup) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_monology__mixtral-soup\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T21:31:12.732008](https://huggingface.co/datasets/open-llm-leaderboard/details_monology__mixtral-soup/blob/main/results_2024-03-21T21-31-12.732008.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2615539890670816,\n\
\ \"acc_stderr\": 0.030988554116831494,\n \"acc_norm\": 0.2622770504848914,\n\
\ \"acc_norm_stderr\": 0.031810879126027564,\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.4994069384927306,\n\
\ \"mc2_stderr\": 0.01625998465200045\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.21160409556313994,\n \"acc_stderr\": 0.011935916358632847,\n\
\ \"acc_norm\": 0.23976109215017063,\n \"acc_norm_stderr\": 0.012476304127453935\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26239792869946227,\n\
\ \"acc_stderr\": 0.0043903867754005324,\n \"acc_norm\": 0.27076279625572597,\n\
\ \"acc_norm_stderr\": 0.00443445671709759\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.21481481481481482,\n\
\ \"acc_stderr\": 0.03547854198560827,\n \"acc_norm\": 0.21481481481481482,\n\
\ \"acc_norm_stderr\": 0.03547854198560827\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891363,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891363\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n\
\ \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.2774566473988439,\n\
\ \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n\
\ \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03333333333333329,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03333333333333329\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n\
\ \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n\
\ \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.1984126984126984,\n \"acc_stderr\": 0.03567016675276864,\n\
\ \"acc_norm\": 0.1984126984126984,\n \"acc_norm_stderr\": 0.03567016675276864\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.19,\n\
\ \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.19,\n \
\ \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n\
\ \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n \"\
acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2727272727272727,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.29015544041450775,\n \"acc_stderr\": 0.03275264467791515,\n\
\ \"acc_norm\": 0.29015544041450775,\n \"acc_norm_stderr\": 0.03275264467791515\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.021193632525148547,\n\
\ \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.021193632525148547\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22962962962962963,\n \"acc_stderr\": 0.02564410863926764,\n \
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.02564410863926764\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.30458715596330277,\n \"acc_stderr\": 0.01973229942035404,\n \"\
acc_norm\": 0.30458715596330277,\n \"acc_norm_stderr\": 0.01973229942035404\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n\
\ \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.24019607843137256,\n\
\ \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.02860951671699494,\n\
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.02860951671699494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.17937219730941703,\n\
\ \"acc_stderr\": 0.025749819569192794,\n \"acc_norm\": 0.17937219730941703,\n\
\ \"acc_norm_stderr\": 0.025749819569192794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467766,\n\
\ \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467766\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2975206611570248,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.19642857142857142,\n\
\ \"acc_stderr\": 0.03770970049347018,\n \"acc_norm\": 0.19642857142857142,\n\
\ \"acc_norm_stderr\": 0.03770970049347018\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3300970873786408,\n \"acc_stderr\": 0.046561471100123486,\n\
\ \"acc_norm\": 0.3300970873786408,\n \"acc_norm_stderr\": 0.046561471100123486\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2541507024265645,\n\
\ \"acc_stderr\": 0.015569254692045778,\n \"acc_norm\": 0.2541507024265645,\n\
\ \"acc_norm_stderr\": 0.015569254692045778\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22254335260115607,\n \"acc_stderr\": 0.02239421566194282,\n\
\ \"acc_norm\": 0.22254335260115607,\n \"acc_norm_stderr\": 0.02239421566194282\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261427,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261427\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351294,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351294\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2508038585209003,\n\
\ \"acc_stderr\": 0.024619771956697165,\n \"acc_norm\": 0.2508038585209003,\n\
\ \"acc_norm_stderr\": 0.024619771956697165\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2127659574468085,\n \"acc_stderr\": 0.0244146129743077,\n \
\ \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.0244146129743077\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24315514993481094,\n\
\ \"acc_stderr\": 0.010956556654417362,\n \"acc_norm\": 0.24315514993481094,\n\
\ \"acc_norm_stderr\": 0.010956556654417362\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2238562091503268,\n \"acc_stderr\": 0.016863008585416613,\n \
\ \"acc_norm\": 0.2238562091503268,\n \"acc_norm_stderr\": 0.016863008585416613\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3183673469387755,\n \"acc_stderr\": 0.029822533793982073,\n\
\ \"acc_norm\": 0.3183673469387755,\n \"acc_norm_stderr\": 0.029822533793982073\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n\
\ \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n\
\ \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.4994069384927306,\n\
\ \"mc2_stderr\": 0.01625998465200045\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5169692186266772,\n \"acc_stderr\": 0.014044390401612976\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/monology/mixtral-soup
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|arc:challenge|25_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|gsm8k|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hellaswag|10_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-31-12.732008.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T21-31-12.732008.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- '**/details_harness|winogrande|5_2024-03-21T21-31-12.732008.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T21-31-12.732008.parquet'
- config_name: results
data_files:
- split: 2024_03_21T21_31_12.732008
path:
- results_2024-03-21T21-31-12.732008.parquet
- split: latest
path:
- results_2024-03-21T21-31-12.732008.parquet
---
# Dataset Card for Evaluation run of monology/mixtral-soup
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [monology/mixtral-soup](https://huggingface.co/monology/mixtral-soup) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_monology__mixtral-soup",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-21T21:31:12.732008](https://huggingface.co/datasets/open-llm-leaderboard/details_monology__mixtral-soup/blob/main/results_2024-03-21T21-31-12.732008.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2615539890670816,
"acc_stderr": 0.030988554116831494,
"acc_norm": 0.2622770504848914,
"acc_norm_stderr": 0.031810879126027564,
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.4994069384927306,
"mc2_stderr": 0.01625998465200045
},
"harness|arc:challenge|25": {
"acc": 0.21160409556313994,
"acc_stderr": 0.011935916358632847,
"acc_norm": 0.23976109215017063,
"acc_norm_stderr": 0.012476304127453935
},
"harness|hellaswag|10": {
"acc": 0.26239792869946227,
"acc_stderr": 0.0043903867754005324,
"acc_norm": 0.27076279625572597,
"acc_norm_stderr": 0.00443445671709759
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.03547854198560827,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.03547854198560827
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.025288394502891363,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.025288394502891363
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.03414014007044036,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.03414014007044036
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2,
"acc_stderr": 0.03333333333333329,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03333333333333329
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276864,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276864
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.29015544041450775,
"acc_stderr": 0.03275264467791515,
"acc_norm": 0.29015544041450775,
"acc_norm_stderr": 0.03275264467791515
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.22564102564102564,
"acc_stderr": 0.021193632525148547,
"acc_norm": 0.22564102564102564,
"acc_norm_stderr": 0.021193632525148547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.02564410863926764,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.02564410863926764
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.30458715596330277,
"acc_stderr": 0.01973229942035404,
"acc_norm": 0.30458715596330277,
"acc_norm_stderr": 0.01973229942035404
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.02860951671699494,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.02860951671699494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.17937219730941703,
"acc_stderr": 0.025749819569192794,
"acc_norm": 0.17937219730941703,
"acc_norm_stderr": 0.025749819569192794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2975206611570248,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.2975206611570248,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.19642857142857142,
"acc_stderr": 0.03770970049347018,
"acc_norm": 0.19642857142857142,
"acc_norm_stderr": 0.03770970049347018
},
"harness|hendrycksTest-management|5": {
"acc": 0.3300970873786408,
"acc_stderr": 0.046561471100123486,
"acc_norm": 0.3300970873786408,
"acc_norm_stderr": 0.046561471100123486
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2541507024265645,
"acc_stderr": 0.015569254692045778,
"acc_norm": 0.2541507024265645,
"acc_norm_stderr": 0.015569254692045778
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22254335260115607,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.22254335260115607,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261427,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261427
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351294,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351294
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2508038585209003,
"acc_stderr": 0.024619771956697165,
"acc_norm": 0.2508038585209003,
"acc_norm_stderr": 0.024619771956697165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.0244146129743077,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.0244146129743077
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24315514993481094,
"acc_stderr": 0.010956556654417362,
"acc_norm": 0.24315514993481094,
"acc_norm_stderr": 0.010956556654417362
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2238562091503268,
"acc_stderr": 0.016863008585416613,
"acc_norm": 0.2238562091503268,
"acc_norm_stderr": 0.016863008585416613
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3183673469387755,
"acc_stderr": 0.029822533793982073,
"acc_norm": 0.3183673469387755,
"acc_norm_stderr": 0.029822533793982073
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.4994069384927306,
"mc2_stderr": 0.01625998465200045
},
"harness|winogrande|5": {
"acc": 0.5169692186266772,
"acc_stderr": 0.014044390401612976
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-eval-MicPie__QA_bias-v2_TEST-MicPie__QA_bias-v2_TEST-9d4c95-1678559331 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- MicPie/QA_bias-v2_TEST
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-2.7b_eval
metrics: []
dataset_name: MicPie/QA_bias-v2_TEST
dataset_config: MicPie--QA_bias-v2_TEST
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-2.7b_eval
* Dataset: MicPie/QA_bias-v2_TEST
* Config: MicPie--QA_bias-v2_TEST
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@MicPie](https://huggingface.co/MicPie) for evaluating this model. |
lordseidon/dear-friend-1k | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 2934825.0
num_examples: 23
download_size: 2936438
dataset_size: 2934825.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Jarmac/llama2_bhc_dataset_train_prompt | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 305881816
num_examples: 68785
download_size: 163719646
dataset_size: 305881816
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Polaculi/fer | ---
license: unknown
---
|
RustamovPY/test_dataset | ---
dataset_info:
features:
- name: voice
dtype: audio
- name: text
dtype: string
- name: speaker
dtype: string
splits:
- name: train
num_bytes: 1257942.0
num_examples: 3
download_size: 1227002
dataset_size: 1257942.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kalese/opus-mt-en-bkm | ---
task_categories:
- translation
size_categories:
- n<1K
--- |
TwoAbove/test-dalle-3 | ---
language:
- en
license:
- cc0-1.0
tags:
- image-text-dataset
- synthetic-dataset
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
- name: link
dtype: string
- name: message_id
dtype: string
- name: timestamp
dtype: string
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
This is a test database. Please ignore. |
narad/ravdess | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- audio-classification
task_ids:
- audio-emotion-recognition
---
# Dataset Card for RAVDESS
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
https://www.kaggle.com/datasets/uwrfkaggler/ravdess-emotional-speech-audio
- **Repository:**
- **Paper:**
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0196391
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS)
Speech audio-only files (16-bit, 48 kHz .wav) from the RAVDESS. The full dataset of speech and song, audio and video (24.8 GB), is available from Zenodo. Construction and perceptual validation of the RAVDESS is described in our Open Access paper in PLoS ONE.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English
## Dataset Structure
The dataset repository contains only preprocessing scripts. When the dataset is loaded and no cached version is found, the data will be downloaded automatically and a .tsv file will be created with all data instances saved as rows in a table.
### Data Instances
[More Information Needed]
### Data Fields
- "audio": a datasets.Audio representation of the spoken utterance,
- "text": a datasets.Value string representation of spoken utterance,
- "labels": a datasets.ClassLabel representation of the emotion label,
- "speaker_id": a datasets.Value string representation of the speaker ID,
- "speaker_gender": a datasets.Value string representation of the speaker gender
### Data Splits
All data is in the train partition.
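For instance, a minimal loading sketch (assuming the repository's preprocessing script runs under your installed `datasets` version; recent releases may require passing `trust_remote_code=True`):
```python
from datasets import load_dataset

# All instances live in the "train" partition.
ravdess = load_dataset("narad/ravdess", split="train")

sample = ravdess[0]
print(sample["text"], sample["labels"], sample["speaker_id"], sample["speaker_gender"])
```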
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
Original Data from the Zenodo release of the RAVDESS Dataset:
Files
This portion of the RAVDESS contains 1440 files: 60 trials per actor x 24 actors = 1440. The RAVDESS contains 24 professional actors (12 female, 12 male), vocalizing two lexically-matched statements in a neutral North American accent. Speech emotions include calm, happy, sad, angry, fearful, surprise, and disgust expressions. Each expression is produced at two levels of emotional intensity (normal, strong), with an additional neutral expression.
File naming convention
Each of the 1440 files has a unique filename. The filename consists of a 7-part numerical identifier (e.g., 03-01-06-01-02-01-12.wav). These identifiers define the stimulus characteristics:
Filename identifiers
Modality (01 = full-AV, 02 = video-only, 03 = audio-only).
Vocal channel (01 = speech, 02 = song).
Emotion (01 = neutral, 02 = calm, 03 = happy, 04 = sad, 05 = angry, 06 = fearful, 07 = disgust, 08 = surprised).
Emotional intensity (01 = normal, 02 = strong). NOTE: There is no strong intensity for the 'neutral' emotion.
Statement (01 = "Kids are talking by the door", 02 = "Dogs are sitting by the door").
Repetition (01 = 1st repetition, 02 = 2nd repetition).
Actor (01 to 24. Odd numbered actors are male, even numbered actors are female).
Filename example: 03-01-06-01-02-01-12.wav
Audio-only (03)
Speech (01)
Fearful (06)
Normal intensity (01)
Statement "dogs" (02)
1st Repetition (01)
12th Actor (12)
Female, as the actor ID number is even.
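Putting the convention together, here is a small parsing sketch (the helper name and output format are illustrative, not part of the dataset):
```python
# Decode a RAVDESS filename such as "03-01-06-01-02-01-12.wav"
# into its 7 identifier fields, following the convention above.
EMOTIONS = {"01": "neutral", "02": "calm", "03": "happy", "04": "sad",
            "05": "angry", "06": "fearful", "07": "disgust", "08": "surprised"}
STATEMENTS = {"01": "Kids are talking by the door",
              "02": "Dogs are sitting by the door"}

def parse_ravdess_filename(filename: str) -> dict:
    stem = filename.rsplit(".", 1)[0]
    modality, channel, emotion, intensity, statement, repetition, actor = stem.split("-")
    return {
        "modality": {"01": "full-AV", "02": "video-only", "03": "audio-only"}[modality],
        "vocal_channel": {"01": "speech", "02": "song"}[channel],
        "emotion": EMOTIONS[emotion],
        "intensity": {"01": "normal", "02": "strong"}[intensity],
        "statement": STATEMENTS[statement],
        "repetition": int(repetition),
        "actor": int(actor),
        "gender": "male" if int(actor) % 2 == 1 else "female",
    }

# Reproduces the worked example above: audio-only, speech, fearful,
# normal intensity, "Dogs are sitting by the door", 1st repetition,
# 12th actor, female.
print(parse_ravdess_filename("03-01-06-01-02-01-12.wav"))
```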
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/)
### Citation Information
How to cite the RAVDESS
Academic citation
If you use the RAVDESS in an academic publication, please use the following citation: Livingstone SR, Russo FA (2018) The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English. PLoS ONE 13(5): e0196391. https://doi.org/10.1371/journal.pone.0196391.
All other attributions
If you use the RAVDESS in a form other than an academic publication, such as in a blog post, school project, or non-commercial product, please use the following attribution: "The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS)" by Livingstone & Russo is licensed under CC BY-NC-SA 4.0.
### Contributions
Thanks to [@narad](https://github.com/narad) for adding this dataset. |
CyberHarem/amami_haruka_theidolmster | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of amami_haruka/天海春香 (THE iDOLM@STER)
This is the dataset of amami_haruka/天海春香 (THE iDOLM@STER), containing 500 images and their tags.
The core tags of this character are `brown_hair, short_hair, green_eyes, ribbon, hair_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), and the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 592.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amami_haruka_theidolmster/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 360.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amami_haruka_theidolmster/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1169 | 747.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amami_haruka_theidolmster/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 529.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amami_haruka_theidolmster/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1169 | 1.01 GiB | [Download](https://huggingface.co/datasets/CyberHarem/amami_haruka_theidolmster/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/amami_haruka_theidolmster',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, choker, open_mouth, smile, solo, blush, hair_flower, sweat, closed_eyes, microphone |
| 1 | 6 |  |  |  |  |  | 1girl, blush, choker, hair_flower, open_mouth, skirt, solo, thighhighs, :d, looking_at_viewer, microphone, mismatched_legwear |
| 2 | 6 |  |  |  |  |  | 1girl, open_mouth, smile, solo, hair_bow, dress |
| 3 | 9 |  |  |  |  |  | 1girl, one_eye_closed, smile, solo, open_mouth, ;d, skirt, star_(symbol), v |
| 4 | 6 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, white_background, bow, open_mouth, red_ribbon, simple_background, plaid_skirt, short_sleeves, :d, bangs, blue_shirt, pleated_skirt, school_uniform |
| 5 | 7 |  |  |  |  |  | 1girl, neck_ribbon, red_ribbon, simple_background, white_background, bangs, long_sleeves, looking_at_viewer, open_mouth, pleated_skirt, school_uniform, :d, blush, hair_bow, solo, sweater_vest, white_shirt, blue_skirt, red_bow, collared_shirt |
| 6 | 10 |  |  |  |  |  | 1girl, solo, bangs, blush, cleavage, looking_at_viewer, medium_breasts, navel, open_mouth, white_bikini, collarbone, day, outdoors, blue_sky, cloud, ocean, water, :d, cowboy_shot, frilled_bikini, jewelry, wet |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | choker | open_mouth | smile | solo | blush | hair_flower | sweat | closed_eyes | microphone | skirt | thighhighs | :d | looking_at_viewer | mismatched_legwear | hair_bow | dress | one_eye_closed | ;d | star_(symbol) | v | white_background | bow | red_ribbon | simple_background | plaid_skirt | short_sleeves | bangs | blue_shirt | pleated_skirt | school_uniform | neck_ribbon | long_sleeves | sweater_vest | white_shirt | blue_skirt | red_bow | collared_shirt | cleavage | medium_breasts | navel | white_bikini | collarbone | day | outdoors | blue_sky | cloud | ocean | water | cowboy_shot | frilled_bikini | jewelry | wet |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------------|:--------|:-------|:--------|:--------------|:--------|:--------------|:-------------|:--------|:-------------|:-----|:--------------------|:---------------------|:-----------|:--------|:-----------------|:-----|:----------------|:----|:-------------------|:------|:-------------|:--------------------|:--------------|:----------------|:--------|:-------------|:----------------|:-----------------|:--------------|:---------------|:---------------|:--------------|:-------------|:----------|:-----------------|:-----------|:-----------------|:--------|:---------------|:-------------|:------|:-----------|:-----------|:--------|:--------|:--------|:--------------|:-----------------|:----------|:------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | X | X | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | X | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | X | X | X | | | | | | X | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | X | | X | X | | | | | | | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | X | | X | X | | | | | | | X | X | | X | | | | | | X | | X | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | | X | | X | X | | | | | | | X | X | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
vinidiol/swpc_test_llama2_Xk | ---
license: cc-by-nc-nd-4.0
---
|
Hamid1212/ExploreGB | ---
license: apache-2.0
---
|
deepset/stackoverflow-survey-2023-text-sql | ---
license: cc-by-4.0
task_categories:
- text-generation
language:
- en
size_categories:
- n<1K
---
# BIQA Text-to-SQL Dataset
The data is from the [Stack Overflow Developer Survey 2023](https://survey.stackoverflow.co/2023/).
Created with this [Notebook](https://colab.research.google.com/drive/12NUeRMsld0toXMSXKFMaQVAv58XwOAT1?usp=sharing); it uses [this spreadsheet](https://docs.google.com/spreadsheets/d/1Xh_TgMbyitvtw08g0byEmBpkwDGZDdBYenthOzcK6qI/edit?usp=sharing), which defines the manual adjustments.
- `data/eval_set_multi_answers_res.json`: Question and query pairs as a list of `SQLSample`s, with possibly more than one valid SQL query per question. Results are also included.
- `data/survey_results_normalized_v2.db`: The main sqlite db file.
The JSON file contains a list of `SQLSample` objects, as defined below:
```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SQLQuery:
query: str
results: Optional[list[tuple]] = None
@dataclass
class SQLSample:
question: str
labels: list[SQLQuery]
prediction: Optional[SQLQuery] = None
pred_eval: str = ""
comment: str = ""
```
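As a usage sketch (with hypothetical local paths for the two files listed above, and assuming the JSON keys mirror these dataclass fields):
```python
import json
import sqlite3

# Load the question/query pairs from a local copy of the eval set.
with open("data/eval_set_multi_answers_res.json") as f:
    samples = json.load(f)

sample = samples[0]
query = sample["labels"][0]["query"]  # first valid SQL query for this question

# Execute it against the sqlite database shipped with the dataset.
con = sqlite3.connect("data/survey_results_normalized_v2.db")
rows = con.execute(query).fetchall()
con.close()

print(sample["question"])
print(rows[:5])
```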
The data can be read in through the code from the [related repository](https://github.com/deepset-ai/biqa-llm). |
sngsfydy/aptos_test | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
splits:
- name: train
num_bytes: 1802932566.6624794
num_examples: 733
download_size: 1800938316
dataset_size: 1802932566.6624794
---
# Dataset Card for "aptos_dataset2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-kand2-sdxl-wuerst-karlo/ead57b12 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 163
num_examples: 10
download_size: 1347
dataset_size: 163
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ead57b12"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_SF-Foundation__TextBase-v0.2 | ---
pretty_name: Evaluation run of SF-Foundation/TextBase-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SF-Foundation/TextBase-v0.2](https://huggingface.co/SF-Foundation/TextBase-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SF-Foundation__TextBase-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T10:49:47.422583](https://huggingface.co/datasets/open-llm-leaderboard/details_SF-Foundation__TextBase-v0.2/blob/main/results_2024-04-15T10-49-47.422583.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6477872383727521,\n\
\ \"acc_stderr\": 0.032222544304474854,\n \"acc_norm\": 0.6467665827432066,\n\
\ \"acc_norm_stderr\": 0.03290217364404172,\n \"mc1\": 0.6376988984088128,\n\
\ \"mc1_stderr\": 0.01682664689726226,\n \"mc2\": 0.7780333506353068,\n\
\ \"mc2_stderr\": 0.013795197050693505\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7244027303754266,\n \"acc_stderr\": 0.013057169655761838,\n\
\ \"acc_norm\": 0.7372013651877133,\n \"acc_norm_stderr\": 0.012862523175351333\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7206731726747659,\n\
\ \"acc_stderr\": 0.004477514681328156,\n \"acc_norm\": 0.8897629954192392,\n\
\ \"acc_norm_stderr\": 0.0031254487960063553\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\
: 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.02354079935872329,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.02354079935872329\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374307,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374307\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n\
\ \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n\
\ \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n\
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834838,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834838\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n\
\ \"acc_stderr\": 0.01639222189940707,\n \"acc_norm\": 0.4011173184357542,\n\
\ \"acc_norm_stderr\": 0.01639222189940707\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.012749206007657476,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.012749206007657476\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6376988984088128,\n\
\ \"mc1_stderr\": 0.01682664689726226,\n \"mc2\": 0.7780333506353068,\n\
\ \"mc2_stderr\": 0.013795197050693505\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8500394632991318,\n \"acc_stderr\": 0.010034394804580809\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6876421531463229,\n \
\ \"acc_stderr\": 0.012765850404191413\n }\n}\n```"
repo_url: https://huggingface.co/SF-Foundation/TextBase-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|arc:challenge|25_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|arc:challenge|25_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|gsm8k|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|gsm8k|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hellaswag|10_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hellaswag|10_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T10-42-26.389102.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T10-49-47.422583.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T10-49-47.422583.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- '**/details_harness|winogrande|5_2024-04-15T10-42-26.389102.parquet'
- split: 2024_04_15T10_49_47.422583
path:
- '**/details_harness|winogrande|5_2024-04-15T10-49-47.422583.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T10-49-47.422583.parquet'
- config_name: results
data_files:
- split: 2024_04_15T10_42_26.389102
path:
- results_2024-04-15T10-42-26.389102.parquet
- split: 2024_04_15T10_49_47.422583
path:
- results_2024-04-15T10-49-47.422583.parquet
- split: latest
path:
- results_2024-04-15T10-49-47.422583.parquet
---
# Dataset Card for Evaluation run of SF-Foundation/TextBase-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SF-Foundation/TextBase-v0.2](https://huggingface.co/SF-Foundation/TextBase-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SF-Foundation__TextBase-v0.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-15T10:49:47.422583](https://huggingface.co/datasets/open-llm-leaderboard/details_SF-Foundation__TextBase-v0.2/blob/main/results_2024-04-15T10-49-47.422583.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6477872383727521,
"acc_stderr": 0.032222544304474854,
"acc_norm": 0.6467665827432066,
"acc_norm_stderr": 0.03290217364404172,
"mc1": 0.6376988984088128,
"mc1_stderr": 0.01682664689726226,
"mc2": 0.7780333506353068,
"mc2_stderr": 0.013795197050693505
},
"harness|arc:challenge|25": {
"acc": 0.7244027303754266,
"acc_stderr": 0.013057169655761838,
"acc_norm": 0.7372013651877133,
"acc_norm_stderr": 0.012862523175351333
},
"harness|hellaswag|10": {
"acc": 0.7206731726747659,
"acc_stderr": 0.004477514681328156,
"acc_norm": 0.8897629954192392,
"acc_norm_stderr": 0.0031254487960063553
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.02354079935872329,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.02354079935872329
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374307,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374307
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834838,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834838
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4011173184357542,
"acc_stderr": 0.01639222189940707,
"acc_norm": 0.4011173184357542,
"acc_norm_stderr": 0.01639222189940707
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657476,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657476
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6376988984088128,
"mc1_stderr": 0.01682664689726226,
"mc2": 0.7780333506353068,
"mc2_stderr": 0.013795197050693505
},
"harness|winogrande|5": {
"acc": 0.8500394632991318,
"acc_stderr": 0.010034394804580809
},
"harness|gsm8k|5": {
"acc": 0.6876421531463229,
"acc_stderr": 0.012765850404191413
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
cemuluoglakci/hallucination_acceptance_agent_instruction_dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: acceptance_label
dtype: string
- name: isHypotheticalQuestion
dtype: int64
- name: IsHypotheticalTerm
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 12757487
num_examples: 2988
- name: validation
num_bytes: 4239105
num_examples: 996
- name: test
num_bytes: 4196645
num_examples: 996
download_size: 7405554
dataset_size: 21193237
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Nexdata/87871_Images_of_106_Facial_Landmarks_Annotation_Data_complicated_scenes | ---
license: cc-by-nc-nd-4.0
---
## Description
87,871 Images of 106 Facial Landmarks Annotation Data (complicated scenes). This dataset covers yellow race, black race, white race and Indian people. To make it more challenging, the data includes multiple scenes, multiple poses, different ages, lighting conditions and complicated expressions. This data can be used for tasks such as face detection and face recognition.
For more details, please refer to the link: https://www.nexdata.ai/dataset/961?source=Huggingface
# Specifications
## Data size
87,871 images; each image contains only one face
## Gender distribution
male: 37,268 images, female: 50,603 images
## Race distribution
56,325 images of yellow race, 15,625 images of white race, 4,492 images of black race, 11,432 images of Indian
## Age distribution
baby: 3,848 images;teenager: 5,792 images; young: 64,935 images; midlife: 9,879 images; senior: 3,418 images
## Collection environment
indoor and outdoor scenes
## Data format
.jpg, .json
## Data diversity
multiple scenes, multiple poses, multiple ages, multiple light conditions and complicated expressions
## Annotation content
9 facial attributes, 106 facial landmarks
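As a rough sketch of how an image and its annotation file might be read together (the exact JSON schema is not documented on this card, so the side-car pairing convention and the `landmarks`/`attributes` field names below are assumptions for illustration only):
```python
import json
from pathlib import Path

from PIL import Image


def load_sample(image_path: str):
    """Load one face image and its (assumed) side-car JSON annotation."""
    image = Image.open(image_path).convert("RGB")
    # Assumption: each .jpg has a .json annotation file with the same stem.
    ann_path = Path(image_path).with_suffix(".json")
    with open(ann_path, "r", encoding="utf-8") as f:
        ann = json.load(f)
    # Hypothetical fields: 106 (x, y) landmark points and 9 facial attributes.
    landmarks = ann.get("landmarks")    # e.g. a list of 106 [x, y] pairs
    attributes = ann.get("attributes")  # e.g. a dict of 9 facial attributes
    return image, landmarks, attributes
```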
## Accuracy
annotation accuracies of facial attributes and landmarks are over 97%
# Licensing Information
Commercial License
|
DBQ/Farfetch.Product.prices.Singapore | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Singapore - Farfetch - Product-level price list
tags:
- webscraping
- ecommerce
- Farfetch
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: int64
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 225892320
num_examples: 602976
download_size: 80912778
dataset_size: 225892320
---
# Farfetch web scraped data
## About the website
The **Ecommerce industry** in the Asia Pacific, particularly in **Singapore**, has experienced significant growth in recent years, largely attributed to the rapid digital transformation and the increasing internet penetration rate. The so-called "Lion City" has become a hub for technological advancements and digital innovations. Companies like **Farfetch** have taken advantage of this and established a solid presence, offering a myriad of luxury fashion products online. The dataset studied comprises **Ecommerce product-list page (PLP) data** on Farfetch in Singapore, demonstrating the reach and impact of such platforms in the blossoming regional market.
## Link to **dataset**
[Singapore - Farfetch - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Farfetch%20Product-prices%20Singapore/r/recZTFqL4hIx7jJnk)
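The YAML header above lists the available columns (brand, category codes, prices in both local currency and EUR, a `flg_discount` flag, etc.). A minimal loading sketch with the `datasets` library, assuming `flg_discount == 1` marks a discounted item:
```python
from datasets import load_dataset

# Single "train" split, as declared in the YAML header above.
ds = load_dataset("DBQ/Farfetch.Product.prices.Singapore", split="train")

# Keep only discounted products (assumption: flg_discount == 1 means discounted).
discounted = ds.filter(lambda row: row["flg_discount"] == 1)
print(discounted[0]["brand"], discounted[0]["price"], discounted[0]["full_price"])
```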
|
open-llm-leaderboard/details_leveldevai__MBA-7B | ---
pretty_name: Evaluation run of leveldevai/MBA-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [leveldevai/MBA-7B](https://huggingface.co/leveldevai/MBA-7B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_leveldevai__MBA-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-19T09:07:51.198061](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__MBA-7B/blob/main/results_2024-01-19T09-07-51.198061.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6557516748328418,\n\
\ \"acc_stderr\": 0.03188082626311897,\n \"acc_norm\": 0.656045399940601,\n\
\ \"acc_norm_stderr\": 0.03253327964878977,\n \"mc1\": 0.45777233782129745,\n\
\ \"mc1_stderr\": 0.01744096571248212,\n \"mc2\": 0.6270987571451256,\n\
\ \"mc2_stderr\": 0.015280108431010799\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6629692832764505,\n \"acc_stderr\": 0.013813476652902274,\n\
\ \"acc_norm\": 0.6945392491467577,\n \"acc_norm_stderr\": 0.01346008047800251\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6919936267675761,\n\
\ \"acc_stderr\": 0.004607256752931883,\n \"acc_norm\": 0.8722366062537343,\n\
\ \"acc_norm_stderr\": 0.0033314391934060345\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n\
\ \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.7052023121387283,\n\
\ \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944427,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944427\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.022755204959542946,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.022755204959542946\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033484,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033484\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887034,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887034\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8312236286919831,\n \"acc_stderr\": 0.024381406832586234,\n \
\ \"acc_norm\": 0.8312236286919831,\n \"acc_norm_stderr\": 0.024381406832586234\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323798,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323798\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069353,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069353\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.423463687150838,\n\
\ \"acc_stderr\": 0.0165254258987735,\n \"acc_norm\": 0.423463687150838,\n\
\ \"acc_norm_stderr\": 0.0165254258987735\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45777233782129745,\n\
\ \"mc1_stderr\": 0.01744096571248212,\n \"mc2\": 0.6270987571451256,\n\
\ \"mc2_stderr\": 0.015280108431010799\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.01090597811215688\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.690674753601213,\n \
\ \"acc_stderr\": 0.012731710925078143\n }\n}\n```"
repo_url: https://huggingface.co/leveldevai/MBA-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|arc:challenge|25_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|gsm8k|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hellaswag|10_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T09-07-51.198061.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T09-07-51.198061.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- '**/details_harness|winogrande|5_2024-01-19T09-07-51.198061.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-19T09-07-51.198061.parquet'
- config_name: results
data_files:
- split: 2024_01_19T09_07_51.198061
path:
- results_2024-01-19T09-07-51.198061.parquet
- split: latest
path:
- results_2024-01-19T09-07-51.198061.parquet
---
# Dataset Card for Evaluation run of leveldevai/MBA-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [leveldevai/MBA-7B](https://huggingface.co/leveldevai/MBA-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_leveldevai__MBA-7B",
"harness_winogrande_5",
split="train")
```
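Similarly, the aggregated metrics live in the "results" configuration, whose "latest" split tracks the most recent run (a minimal sketch based on the config list above):
```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details_leveldevai__MBA-7B",
	"results",
	split="latest")
```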
## Latest results
These are the [latest results from run 2024-01-19T09:07:51.198061](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__MBA-7B/blob/main/results_2024-01-19T09-07-51.198061.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6557516748328418,
"acc_stderr": 0.03188082626311897,
"acc_norm": 0.656045399940601,
"acc_norm_stderr": 0.03253327964878977,
"mc1": 0.45777233782129745,
"mc1_stderr": 0.01744096571248212,
"mc2": 0.6270987571451256,
"mc2_stderr": 0.015280108431010799
},
"harness|arc:challenge|25": {
"acc": 0.6629692832764505,
"acc_stderr": 0.013813476652902274,
"acc_norm": 0.6945392491467577,
"acc_norm_stderr": 0.01346008047800251
},
"harness|hellaswag|10": {
"acc": 0.6919936267675761,
"acc_stderr": 0.004607256752931883,
"acc_norm": 0.8722366062537343,
"acc_norm_stderr": 0.0033314391934060345
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.03476599607516478,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.03476599607516478
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944427,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944427
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.022755204959542946,
"acc_norm": 0.8,
"acc_norm_stderr": 0.022755204959542946
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033484,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887034,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887034
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8312236286919831,
"acc_stderr": 0.024381406832586234,
"acc_norm": 0.8312236286919831,
"acc_norm_stderr": 0.024381406832586234
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323798,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323798
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069353,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069353
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.423463687150838,
"acc_stderr": 0.0165254258987735,
"acc_norm": 0.423463687150838,
"acc_norm_stderr": 0.0165254258987735
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.45777233782129745,
"mc1_stderr": 0.01744096571248212,
"mc2": 0.6270987571451256,
"mc2_stderr": 0.015280108431010799
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.01090597811215688
},
"harness|gsm8k|5": {
"acc": 0.690674753601213,
"acc_stderr": 0.012731710925078143
}
}
```
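As a quick illustration, here is a minimal sketch of how the per-subject `hendrycksTest` (MMLU) accuracies above could be macro-averaged; it assumes the results JSON shown above has been saved locally as `results.json` (a hypothetical filename):
```python
import json

# Load the results JSON shown above (the local path is an assumption for illustration).
with open("results.json") as f:
    results = json.load(f)

# Collect per-subject accuracies for the MMLU (hendrycksTest) subtasks.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]

# Macro-average: every subject is weighted equally, regardless of its size.
print(f"MMLU macro-average accuracy: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```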
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-staging-eval-project-xsum-4ce7da77-12905722 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: google/bigbird-pegasus-large-arxiv
metrics: []
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/bigbird-pegasus-large-arxiv
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
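The predictions can also be pulled locally with 🤗 Datasets. A minimal sketch, assuming the repository's default config (the exact split and column names are best checked in the dataset viewer):
```python
from datasets import load_dataset

# Load the prediction dataset generated by the AutoTrain evaluation job.
# Relying on the default config is an assumption; inspect the repo for the exact layout.
preds = load_dataset("autoevaluate/autoeval-staging-eval-project-xsum-4ce7da77-12905722")

print(preds)  # shows the available splits and their columns
```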
## Contributions
Thanks to [@grapplerulrich](https://huggingface.co/grapplerulrich) for evaluating this model. |
silk-road/Chat_Suzumiya_Fusion_B | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: context
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 148676547
num_examples: 37907
download_size: 44576110
dataset_size: 148676547
---
# Dataset Card for "Chat_Suzumiya_Fusion_B"
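The `train` split exposes the two string fields declared in the metadata above (`context` and `target`); a minimal loading sketch:
```python
from datasets import load_dataset

# Load the single train split declared in the dataset metadata above.
ds = load_dataset("silk-road/Chat_Suzumiya_Fusion_B", split="train")

example = ds[0]
print(example["context"])  # "context" string field
print(example["target"])   # "target" string field
```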
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_aloobun__Reyna-CoT-4B-v0.1 | ---
pretty_name: Evaluation run of aloobun/Reyna-CoT-4B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [aloobun/Reyna-CoT-4B-v0.1](https://huggingface.co/aloobun/Reyna-CoT-4B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aloobun__Reyna-CoT-4B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-23T05:19:01.726873](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__Reyna-CoT-4B-v0.1/blob/main/results_2024-02-23T05-19-01.726873.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5513796437004328,\n\
\ \"acc_stderr\": 0.03390923169391072,\n \"acc_norm\": 0.5596390174622787,\n\
\ \"acc_norm_stderr\": 0.03465607779320046,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4308604951201183,\n\
\ \"mc2_stderr\": 0.01407704178265183\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4069965870307167,\n \"acc_stderr\": 0.014356399418009126,\n\
\ \"acc_norm\": 0.447098976109215,\n \"acc_norm_stderr\": 0.014529380160526847\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5205138418641705,\n\
\ \"acc_stderr\": 0.004985580065946455,\n \"acc_norm\": 0.7112129057956582,\n\
\ \"acc_norm_stderr\": 0.004522725412556969\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.02977308271331987,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.02977308271331987\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.03784271932887467,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.03784271932887467\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4523809523809524,\n \"acc_stderr\": 0.02563425811555496,\n \"\
acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.02563425811555496\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n\
\ \"acc_stderr\": 0.027162537826948458,\n \"acc_norm\": 0.6483870967741936,\n\
\ \"acc_norm_stderr\": 0.027162537826948458\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.031353050095330855,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.031353050095330855\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.030516111371476008,\n\
\ \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.030516111371476008\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.025069094387296532,\n\
\ \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.025069094387296532\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.728440366972477,\n \"acc_stderr\": 0.01906909836319144,\n \"acc_norm\"\
: 0.728440366972477,\n \"acc_norm_stderr\": 0.01906909836319144\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n\
\ \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.46296296296296297,\n\
\ \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373616,\n\
\ \"acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373616\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.029818024749753088,\n \
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.029818024749753088\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.042059539338841226,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.042059539338841226\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.04453197507374983,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.04453197507374983\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899615,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899615\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384493,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384493\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543688,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543688\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7215836526181354,\n\
\ \"acc_stderr\": 0.016028295188992455,\n \"acc_norm\": 0.7215836526181354,\n\
\ \"acc_norm_stderr\": 0.016028295188992455\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977254,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2849162011173184,\n\
\ \"acc_stderr\": 0.015096222302469806,\n \"acc_norm\": 0.2849162011173184,\n\
\ \"acc_norm_stderr\": 0.015096222302469806\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.02773283435336394,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.02773283435336394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\
\ \"acc_stderr\": 0.02755994980234782,\n \"acc_norm\": 0.6205787781350482,\n\
\ \"acc_norm_stderr\": 0.02755994980234782\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5679012345679012,\n \"acc_stderr\": 0.027563010971606676,\n\
\ \"acc_norm\": 0.5679012345679012,\n \"acc_norm_stderr\": 0.027563010971606676\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38461538461538464,\n\
\ \"acc_stderr\": 0.012425548416302943,\n \"acc_norm\": 0.38461538461538464,\n\
\ \"acc_norm_stderr\": 0.012425548416302943\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.0302114796091216,\n\
\ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.0302114796091216\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5359477124183006,\n \"acc_stderr\": 0.02017548876548404,\n \
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.02017548876548404\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.04653429807913507,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.04653429807913507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.030555316755573637,\n\
\ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.030555316755573637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.030965903123573023,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.030965903123573023\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.0340105262010409,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.0340105262010409\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4308604951201183,\n\
\ \"mc2_stderr\": 0.01407704178265183\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6771902131018153,\n \"acc_stderr\": 0.013140498173357943\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16906747536012132,\n \
\ \"acc_stderr\": 0.010324171445497354\n }\n}\n```"
repo_url: https://huggingface.co/aloobun/Reyna-CoT-4B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|arc:challenge|25_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|gsm8k|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hellaswag|10_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T05-19-01.726873.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T05-19-01.726873.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- '**/details_harness|winogrande|5_2024-02-23T05-19-01.726873.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-23T05-19-01.726873.parquet'
- config_name: results
data_files:
- split: 2024_02_23T05_19_01.726873
path:
- results_2024-02-23T05-19-01.726873.parquet
- split: latest
path:
- results_2024-02-23T05-19-01.726873.parquet
---
# Dataset Card for Evaluation run of aloobun/Reyna-CoT-4B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [aloobun/Reyna-CoT-4B-v0.1](https://huggingface.co/aloobun/Reyna-CoT-4B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aloobun__Reyna-CoT-4B-v0.1",
"harness_winogrande_5",
	split="latest")
```
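To load the aggregated metrics instead of the per-task details, you can read the "results" configuration; a minimal sketch, using the split names from the configuration list above:
```python
from datasets import load_dataset

# "results" stores the aggregated run metrics; "latest" always points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_aloobun__Reyna-CoT-4B-v0.1",
                       "results",
                       split="latest")
```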
## Latest results
These are the [latest results from run 2024-02-23T05:19:01.726873](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__Reyna-CoT-4B-v0.1/blob/main/results_2024-02-23T05-19-01.726873.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5513796437004328,
"acc_stderr": 0.03390923169391072,
"acc_norm": 0.5596390174622787,
"acc_norm_stderr": 0.03465607779320046,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4308604951201183,
"mc2_stderr": 0.01407704178265183
},
"harness|arc:challenge|25": {
"acc": 0.4069965870307167,
"acc_stderr": 0.014356399418009126,
"acc_norm": 0.447098976109215,
"acc_norm_stderr": 0.014529380160526847
},
"harness|hellaswag|10": {
"acc": 0.5205138418641705,
"acc_stderr": 0.004985580065946455,
"acc_norm": 0.7112129057956582,
"acc_norm_stderr": 0.004522725412556969
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296564,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296564
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.02977308271331987,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.02977308271331987
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.03784271932887467,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.03784271932887467
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.02563425811555496,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.02563425811555496
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.027162537826948458,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.027162537826948458
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.031353050095330855,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.031353050095330855
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.030516111371476008,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.030516111371476008
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.025069094387296532,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.025069094387296532
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.728440366972477,
"acc_stderr": 0.01906909836319144,
"acc_norm": 0.728440366972477,
"acc_norm_stderr": 0.01906909836319144
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373616,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373616
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.029818024749753088,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.029818024749753088
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.042059539338841226,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.042059539338841226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374983,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374983
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899615,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899615
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543688,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543688
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7215836526181354,
"acc_stderr": 0.016028295188992455,
"acc_norm": 0.7215836526181354,
"acc_norm_stderr": 0.016028295188992455
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977254,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2849162011173184,
"acc_stderr": 0.015096222302469806,
"acc_norm": 0.2849162011173184,
"acc_norm_stderr": 0.015096222302469806
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.02773283435336394,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.02773283435336394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6205787781350482,
"acc_stderr": 0.02755994980234782,
"acc_norm": 0.6205787781350482,
"acc_norm_stderr": 0.02755994980234782
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5679012345679012,
"acc_stderr": 0.027563010971606676,
"acc_norm": 0.5679012345679012,
"acc_norm_stderr": 0.027563010971606676
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38461538461538464,
"acc_stderr": 0.012425548416302943,
"acc_norm": 0.38461538461538464,
"acc_norm_stderr": 0.012425548416302943
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.0302114796091216,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.0302114796091216
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.02017548876548404,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.02017548876548404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913507,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.030555316755573637,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.030555316755573637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573023,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573023
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4308604951201183,
"mc2_stderr": 0.01407704178265183
},
"harness|winogrande|5": {
"acc": 0.6771902131018153,
"acc_stderr": 0.013140498173357943
},
"harness|gsm8k|5": {
"acc": 0.16906747536012132,
"acc_stderr": 0.010324171445497354
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AlexxxSem/quest-12-classes | ---
license: mit
language:
- en
size_categories:
- 1K<n<10K
task_categories:
- text-classification
--- |
sirCamp/trivia_as2_pairs_and_triplets | ---
dataset_info:
features:
- name: texts
sequence: string
splits:
- name: pairs
num_bytes: 29812945
num_examples: 115956
- name: triplets
num_bytes: 29812945
num_examples: 115956
download_size: 26265754
dataset_size: 59625890
---
# Dataset Card for "trivia_as2_pairs_and_triplets"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dahoas/4096_filtered_base_code_review | ---
dataset_info:
features:
- name: body
dtype: string
- name: comments
list:
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: Score
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: body
dtype: string
- name: comments
list:
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: Score
dtype: string
- name: body
dtype: string
- name: meta_data
struct:
- name: CommentCount
dtype: string
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: ParentId
dtype: string
- name: Score
dtype: string
- name: meta_data
struct:
- name: AcceptedAnswerId
dtype: string
- name: CommentCount
dtype: string
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: Score
dtype: string
- name: Tags
sequence: string
- name: Title
dtype: string
- name: question_id
dtype: string
splits:
- name: train
num_bytes: 206395804
num_examples: 37026
download_size: 106795288
dataset_size: 206395804
---
# Dataset Card for "4096_filtered_base_code_review"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
StevenQ88/testDataset | ---
license: mit
---
|
bentrevett/multi30k | ---
task_categories:
- translation
language:
- en
- de
size_categories:
- 10K<n<100K
---
# Multi30k
This repository contains the Multi30k dataset, i.e. the "task 1" dataset from [here](https://www.statmt.org/wmt16/multimodal-task.html).
Each example consists of an "en" and a "de" feature. "en" is an English sentence, and "de" is the German translation of the English sentence.
### Data Splits
The Multi30k dataset has 3 splits: _train_, _validation_, and _test_.
| Dataset Split | Number of Instances in Split |
| ------------- | ------------------------------------------- |
| Train | 29,000 |
| Validation | 1,014 |
| Test | 1,000 |
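Loading follows the standard `datasets` pattern; a minimal sketch, assuming the repository exposes the splits named in the table above:
```python
from datasets import load_dataset

# Each example pairs an English sentence ("en") with its German translation ("de").
multi30k = load_dataset("bentrevett/multi30k")
print(multi30k["train"][0])
```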
### Citation Information
```
@inproceedings{elliott-EtAl:2016:VL16,
 author = {{Elliott}, D. and {Frank}, S. and {Sima'an}, K. and {Specia}, L.},
 title = {Multi30K: Multilingual English-German Image Descriptions},
 booktitle = {Proceedings of the 5th Workshop on Vision and Language},
 year = {2016},
 pages = {70--74}
}
``` |
kgr123/quality_counter_4500_4_buckets | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 24695197
num_examples: 1929
- name: train
num_bytes: 24495183
num_examples: 1935
- name: validation
num_bytes: 24980990
num_examples: 1941
download_size: 16330585
dataset_size: 74171370
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
ParsifalBR/LOCUTOR | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_rte_comparative_as_to | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 15337
num_examples: 32
- name: train
num_bytes: 15304
num_examples: 33
download_size: 30090
dataset_size: 30641
---
# Dataset Card for "MULTI_VALUE_rte_comparative_as_to"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
blastwind/basic_monads | ---
dataset_info:
features:
- name: monad
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 3284
num_examples: 7
download_size: 5652
dataset_size: 3284
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
daze-unlv/medmcqa | ---
license: apache-2.0
---
|
adityaConnect77/bm | ---
license: apache-2.0
---
|
LukeXYZ/Tsuo | ---
license: openrail
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/b8d275dd | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1340
dataset_size: 180
---
# Dataset Card for "b8d275dd"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vabedfa/nnatchaploy | ---
license: bigcode-openrail-m
---
|
SEACrowd/indolem_sentiment | ---
tags:
- sentiment-analysis
language:
- ind
---
# indolem_sentiment
IndoLEM (Indonesian Language Evaluation Montage) is a comprehensive Indonesian benchmark that comprises seven tasks for the Indonesian language. This benchmark is categorized into three pillars of NLP tasks: morpho-syntax, semantics, and discourse.
This dataset is based on binary classification (positive and negative), with the following distribution:
* Train: 3638 sentences
* Development: 399 sentences
* Test: 1011 sentences
The data is sourced from 1) Twitter [(Koto and Rahmaningtyas, 2017)](https://www.researchgate.net/publication/321757985_InSet_Lexicon_Evaluation_of_a_Word_List_for_Indonesian_Sentiment_Analysis_in_Microblogs)
and 2) [hotel reviews](https://github.com/annisanurulazhar/absa-playground/).
The experiment is based on 5-fold cross validation.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
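A minimal loading sketch, assuming the default configuration loads directly through `datasets` (the `nusacrowd` helpers may expose additional schemas):
```python
from datasets import load_dataset

# Binary sentiment classification (positive/negative); split sizes follow the counts above.
indolem = load_dataset("SEACrowd/indolem_sentiment")
print(indolem["train"][0])
```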
## Citation
```
@article{DBLP:journals/corr/abs-2011-00677,
author = {Fajri Koto and
Afshin Rahimi and
Jey Han Lau and
Timothy Baldwin},
title = {IndoLEM and IndoBERT: {A} Benchmark Dataset and Pre-trained Language
Model for Indonesian {NLP}},
journal = {CoRR},
volume = {abs/2011.00677},
year = {2020},
url = {https://arxiv.org/abs/2011.00677},
eprinttype = {arXiv},
eprint = {2011.00677},
timestamp = {Fri, 06 Nov 2020 15:32:47 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-2011-00677.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
## License
Creative Commons Attribution Share-Alike 4.0 International
## Homepage
[https://indolem.github.io/](https://indolem.github.io/)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
kaleemWaheed/twitter_dataset_1713138547 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 13116
num_examples: 32
download_size: 10247
dataset_size: 13116
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_141 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1055383196.0
num_examples: 207263
download_size: 1074897618
dataset_size: 1055383196.0
---
# Dataset Card for "chunk_141"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mihaien/my-full-dataset-1024 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 598398078.152
num_examples: 6476
download_size: 626373400
dataset_size: 598398078.152
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/pennsylvania_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of pennsylvania/ペンシルベニア/宾夕法尼亚 (Azur Lane)
This is the dataset of pennsylvania/ペンシルベニア/宾夕法尼亚 (Azur Lane), containing 10 images and their tags.
The core tags of this character are `long_hair, green_eyes, brown_hair, breasts, ponytail, large_breasts, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 13.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pennsylvania_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 7.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pennsylvania_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 20 | 13.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pennsylvania_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 11.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pennsylvania_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 20 | 19.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pennsylvania_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pennsylvania_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, pantyhose, simple_background, white_background, black_gloves, cleavage, looking_at_viewer, blush, uniform |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | pantyhose | simple_background | white_background | black_gloves | cleavage | looking_at_viewer | blush | uniform |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:------------|:--------------------|:-------------------|:---------------|:-----------|:--------------------|:--------|:----------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
|
davidgaofc/RM_inout_bal_train | ---
dataset_info:
features:
- name: Text
dtype: string
- name: Label
dtype: int64
splits:
- name: train
num_bytes: 439306.3841463415
num_examples: 910
download_size: 194959
dataset_size: 439306.3841463415
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
senhorsapo/barbabranca | ---
license: openrail
---
|
UCTL8LLKEGXlXqDLVAOLDNnA/VanishmentThisWorld | ---
viewer: false
---
 |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/b5cf07b3 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1325
dataset_size: 186
---
# Dataset Card for "b5cf07b3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kewu93/three_styles_prompted_500 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 34576478.8
num_examples: 1200
- name: val
num_bytes: 8468533.6
num_examples: 300
download_size: 42069788
dataset_size: 43045012.4
---
# Dataset Card for "three_styles_prompted_500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
purifesquer/tc | ---
license: openrail
---
|
Isaac-Seungwon/llama2_custom_code | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7786
num_examples: 32
download_size: 4172
dataset_size: 7786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
naorm/website-screenshots-git-large | ---
language:
- en
dataset_info:
features:
- name: image
dtype: image
- name: index
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 151888174.776
num_examples: 1688
- name: validation
num_bytes: 44114537.0
num_examples: 484
- name: test
num_bytes: 22282271.0
num_examples: 242
download_size: 56762629
dataset_size: 218284982.776
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
marup/GlamrockChicaRVC400Epochs | ---
license: openrail
---
|
Nexdata/Thai_Speech_Data_by_Mobile_Phone_Reading | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Thai_Speech_Data_by_Mobile_Phone_Reading
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/69?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Thai speech data (reading) was collected from 498 native Thai speakers and recorded in a quiet environment. The recordings are rich in content, covering multiple categories such as economics, entertainment, news, figures, and oral speech. There are around 400 sentences per speaker, and the valid data volume is 292 hours. All texts are manually transcribed with high accuracy.
For more details, please refer to the link: https://www.nexdata.ai/datasets/69?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Thai
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
tasksource/SpaceNLI | ---
dataset_info:
features:
- name: id
dtype: string
- name: label
dtype: string
- name: src
dtype: string
- name: cat
dtype: string
- name: exp
dtype: string
- name: ent_type
dtype: string
- name: prem_num
dtype: int64
- name: premises
dtype: string
- name: hypothesis
dtype: string
- name: subs
struct:
- name: NP0
dtype: string
- name: NP1
dtype: string
- name: NP2
dtype: string
- name: NP3
dtype: string
- name: NP4
dtype: string
- name: _at_least
dtype: string
- name: immediately_r_01
dtype: string
splits:
- name: train
num_bytes: 7276049
num_examples: 32000
download_size: 1027566
dataset_size: 7276049
license: mit
---
# Dataset Card for "SpaceNLI"
https://github.com/kovvalsky/SpaceNLI/tree/main
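A minimal loading sketch (the feature names follow the dataset metadata above):
```python
from datasets import load_dataset

# Each example pairs one or more premises with a hypothesis and an NLI label.
spacenli = load_dataset("tasksource/SpaceNLI", split="train")
ex = spacenli[0]
print(ex["premises"], "=>", ex["hypothesis"], ex["label"])
```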
```
@misc{abzianidze2023spacenli,
title={SpaceNLI: Evaluating the Consistency of Predicting Inferences in Space},
author={Lasha Abzianidze and Joost Zwarts and Yoad Winter},
year={2023},
eprint={2307.02269},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
classla/hr500k | ---
language:
- hr
license:
- cc-by-sa-4.0
task_categories:
- other
task_ids:
- lemmatization
- named-entity-recognition
- part-of-speech
tags:
- structure-prediction
- normalization
- tokenization
---
The hr500k training corpus contains 506,457 Croatian tokens manually annotated on the levels of tokenisation, sentence segmentation, morphosyntactic tagging, lemmatisation, named entities and dependency syntax.
On the sentence level, the dataset contains 20,159 training samples, 1,963 validation samples and 2,672 test samples
across the respective data splits. Each sample represents a sentence and includes the following features:
sentence ID ('sent\_id'), sentence text ('text'), list of tokens ('tokens'), list of lemmas ('lemmas'),
list of MULTEXT-East tags ('xpos\_tags), list of UPOS tags ('upos\_tags'), list of morphological features ('feats'),
and list of IOB tags ('iob\_tags'). A subset of the data also contains universal dependencies ('ud') and consists of
7,498 training samples, 649 validation samples, and 742 test samples.
Three dataset configurations are available, namely 'ner', 'upos', and 'ud', with the corresponding features
encoded as class labels. If the configuration is not specified, it defaults to 'ner'.
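A minimal loading sketch for one configuration, assuming the standard `datasets` entry point:
```python
from datasets import load_dataset

# 'ner' is the default configuration; 'upos' and 'ud' are selected the same way.
hr500k = load_dataset("classla/hr500k", "ner")
print(hr500k["train"][0]["tokens"])
```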
If you use this dataset in your research, please cite the following paper:
```
@InProceedings{LJUBEI16.340,
author = {Nikola Ljubešić and Filip Klubička and Željko Agić and Ivo-Pavao Jazbec},
title = {New Inflectional Lexicons and Training Corpora for Improved Morphosyntactic Annotation of Croatian and Serbian},
booktitle = {Proceedings of the Tenth International Conference on Language Resources and Evaluation (LREC 2016)},
year = {2016},
month = {may},
date = {23-28},
location = {Portorož, Slovenia},
editor = {Nicoletta Calzolari (Conference Chair) and Khalid Choukri and Thierry Declerck and Sara Goggi and Marko Grobelnik and Bente Maegaard and Joseph Mariani and Helene Mazo and Asuncion Moreno and Jan Odijk and Stelios Piperidis},
publisher = {European Language Resources Association (ELRA)},
address = {Paris, France},
isbn = {978-2-9517408-9-1},
language = {english}
}
``` |
autoevaluate/autoeval-eval-futin__feed-sen_en_-1de085-2240171544 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-560m
metrics: []
dataset_name: futin/feed
dataset_config: sen_en_
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-560m
* Dataset: futin/feed
* Config: sen_en_
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
emilianJR/ftinder_more | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1484550.0
num_examples: 59
download_size: 1480191
dataset_size: 1484550.0
---
# Dataset Card for "ftinder_more"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pallavi176/resume_dataset | ---
dataset_info:
features:
- name: resume_str
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 12496606
num_examples: 1987
- name: test
num_bytes: 1631312
num_examples: 248
- name: validation
num_bytes: 1604207
num_examples: 249
download_size: 7940604
dataset_size: 15732125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
torileatherman/sentiment_analysis_batch | ---
dataset_info:
features:
- name: Headline
sequence: int64
- name: Url
dtype: string
- name: Headline_string
dtype: string
splits:
- name: train
num_bytes: 5984
num_examples: 10
download_size: 3050
dataset_size: 5984
---
# Dataset Card for "sentiment_analysis_batch"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Laethitia/GaaraIA | ---
license: openrail
---
|
Nexdata/Multi-class_Fashion_Item_Detection_Data | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Multi-class_Fashion_Item_Detection_Data
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1057?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
144,810 images of multi-class fashion item detection data. The dataset includes 19,968 images of males and 124,842 images of females. The fashion items were divided into four groups by season (spring, summer, autumn and winter). For annotation, rectangular bounding boxes were used to mark the fashion items. The data can be used for tasks such as fashion item detection and fashion recommendation.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1057?source=Huggingface
### Supported Tasks and Leaderboards
object-detection, computer-vision: The dataset can be used to train a model for object detection.
### Languages
English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions |